In my last post (Becoming Data Driven), we began a discussion about the goal of becoming a data-driven organization. We determined it’s not so much about the tools as it is about the leadership, philosophy, and decision processes of a company that help it reach a data-driven state.

If you are data driven, then your analytic tools and insights are helping you drive another dollar of revenue, reduce another dollar of expense, find ways to do more with less, and secure your future against disruption.

As part of the post, we introduced the Business Threat Assessment (BTA), a mechanism used by business leaders interested in being more data driven. The outcomes of this assessment are a list of three tactical threats, three strategic threats, and the three most persistent challenges to an operation’s efficiency. It’s a way to get organized by establishing a meaningful priority set that should be evergreen. The BTA is as much a way of thinking as it is a process, and it should be scalable up and down your team. Your managers should be able to weigh in with their interpretation as it relates to their areas of responsibility.

In this post, we will begin to put the outputs of your Business Threat Assessment to work by introducing the first level of analytics – descriptive analytics – and showing how it can better align your data to your fundamental business goals.

Descriptive analytics encompasses how your current and historical data sets are produced, manipulated, and displayed (as compared to predictive, or forward-looking tools). Here are the foundational elements a company should include, in order of complexity, to develop a successful descriptive analytics solution:

  1. The Data Assets – All the data generated and captured in your operation, your company’s ocean of data, so to speak. In my first RUMBLE blog post (The Data Map: The Road to Managing Data as an Asset), we talked about creating a data map that inventoried these assets, and that inventory is a necessary first step for any company to pursue.
  2. The Data Management Infrastructure – The various repositories where you are storing data today – it might be organized, partly organized or not organized at all.
  3. The Data Presentation Layer – Your existing reports. 
  4. A Data Model – An overlay that ties all your data elements together, defines their types and values, and illuminates the relationships and sequencing between them. This can be as-built, a reflection of what has grown over time, or optimized (detailed later); a minimal sketch of such an overlay follows this list.
  5. A Data Archive – An advancement over “run of the mill” data management and storage, a Data Archive constitutes a designed repository and database infrastructure that typically integrates and organizes your data elements into an efficient structure that is more easily accessed and manipulated for reporting and analytics.
  6. A Business Intelligence Tool – These toolkits (Tableau, QlikView, Alteryx, and Microsoft Power BI) optimize the visual display of data and reporting by integrating user-configurable dashboards, report schedulers, and distribution and publication functionality.
  7. An Analytic Data Set and Toolkit – A specialized data repository created by your analytics team and populated with the critical data element subsets most relevant to your analytic requirements. In this advanced approach, the subset has been statistically validated through exploratory data analysis as the most useful data for query and investigation.
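
To make the idea of a data model overlay concrete, here is a minimal sketch in Python of how a few elements and their relationships might be catalogued. The element names, source systems, and relationship types are hypothetical, chosen only to illustrate the structure; a real model would cover every element in your data map.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataElement:
    """One named element in the data model: its type, allowed values, and the source that produces it."""
    name: str
    dtype: str                      # e.g. "date", "decimal", "string"
    source: str                     # the system or feed that produces it
    allowed_values: List[str] = field(default_factory=list)

@dataclass
class Relationship:
    """A directed link between two elements, e.g. a key that ties a feed to a report."""
    parent: str
    child: str
    kind: str                       # e.g. "one-to-many", "derived-from"

# A tiny, hypothetical slice of an as-built model for an investment accounting operation.
elements = [
    DataElement("security_id", "string", "custodian_feed"),
    DataElement("trade_date", "date", "order_management_system"),
    DataElement("book_value", "decimal", "general_ledger"),
]

relationships = [
    Relationship("security_id", "book_value", "one-to-many"),
    Relationship("trade_date", "book_value", "derived-from"),
]

def elements_from(source: str) -> List[str]:
    """List every modeled element a given source is responsible for."""
    return [e.name for e in elements if e.source == source]

print(elements_from("general_ledger"))   # ['book_value']
```

Even a catalogue this simple answers questions an as-built reporting jungle usually cannot: where an element originates, what type it is, and which other elements depend on it.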

In short, descriptive analytics manipulates your current and historical data assets we’ve listed above (including your reports) to make more effective business decisions possible.

Descriptive Analytics Applied

The goal of increasing efficiency in a systematic manner requires a commitment to the philosophy and process of continuous improvement. Measure, analyze, respond, act, measure, analyze, respond. Repeat.

Common applied descriptive analytics initiatives proven to increase efficiency include:

  • Reports Audit and Data Model Optimization
  • KPI Review and Testing
  • Attribution Analysis

In this blog, we’ll start with a more in-depth analysis of the first example.

  1. The Reports Audit and Data Model Creation or Optimization

Real-life Example: A client had decided to implement a new Business Intelligence (BI) tool and requested our help migrating the reporting infrastructure of their investment accounting team. We discovered a reporting infrastructure of over 200 spreadsheet reports that had accrued over the previous decade. Each was hand-built, hand-operated and tied to critical processes of month-end and quarter-end close cycles.

There was no data model, and the data feeds driving these reports came from over 50 discrete sources. We conducted a reporting audit and found that many of the report elements overlapped. We were able to create a data model that established the key sources and data elements, including their relationships and locations, and used it to populate a data repository we built for the BI tool. The result was a 50% reduction in the number of reports, with hours saved equivalent to 1.5 FTE of headcount.
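
As an illustration of how the overlap found in such an audit might be quantified, here is a minimal sketch in Python. The report names, data elements, and the 50% threshold are hypothetical, and this is not the tooling from the engagement described above; it simply shows the idea of treating each report as a set of data elements and flagging pairs that share most of them as consolidation candidates.

```python
from itertools import combinations

# Hypothetical inventory from a reports audit: report name -> data elements it displays.
report_elements = {
    "month_end_holdings.xlsx": {"security_id", "book_value", "market_value", "trade_date"},
    "quarter_end_valuation.xlsx": {"security_id", "book_value", "market_value", "accrued_interest"},
    "daily_cash_recon.xlsx": {"account_id", "cash_balance", "trade_date"},
}

def overlap(a: set, b: set) -> float:
    """Jaccard similarity: shared elements divided by total distinct elements."""
    return len(a & b) / len(a | b)

# Flag report pairs that share most of their elements as candidates for consolidation.
THRESHOLD = 0.5
for (name_a, elems_a), (name_b, elems_b) in combinations(report_elements.items(), 2):
    score = overlap(elems_a, elems_b)
    if score >= THRESHOLD:
        print(f"{name_a} <-> {name_b}: {score:.0%} overlap")
```

Paired with counts of how often each report is actually opened or distributed, a ranking like this gives you a defensible shortlist of reports to merge or retire.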

Most companies can benefit from a process like this. Understanding the type, frequency, and audience of all the reports you produce allows you to establish control and impose efficiency where it may not currently be present. If you can’t point to a data model that classifies and organizes your data elements, you can’t control the evolution of report creep. A good place to start is to consciously review the time, dollars and staff you have supporting your current reporting structure. That investment of TIME should create the ROI urgency you need to ensure a data map is created and optimized.

In my next blog I’ll introduce two additional applied descriptive analytics examples: KPI Review and Testing and Attribution Analysis. Both will include real-life examples of how organizations benefited from their use.

(Article by Chris Schultz, a principal at Analytic Marketing Innovations (AMI) and a RUMBLE Strategic Partner. Their solutions delivery approach identifies executable steps and recommends both near-term and long-term courses of action, helping your business leverage data insights for growth and transformation.)

RUMBLE strategic partner Cradlepoint recently sponsored a white paper (A Sensible Approach to Smart City Projects) showing that the most successful smart city initiatives have been those that started small.

“Some companies will come and try to create a global dashboard that tries to take data from every part of a city and analyze it,” said Ken Hosac, vice president of IoT strategy and business development, Cradlepoint. “The approach we’ve been taking is to find a specific targeted use case and make that use case better, or enable a use case that didn’t exist before.”

This more “sensible approach” of refining one service at a time instead of undertaking a massive project upfront has helped overcome the hesitancy of city managers to take advantage of smart city technologies.

The White Paper explores various facets of smart city initiatives, including:

  • Examples of smart city technologies
  • The potential benefits of smart cities in enhancing the quality of life for their citizens
  • Common and pragmatic first steps in embarking upon smart city initiatives
  • Smart city pitfalls to avoid

Cradlepoint is a strategic partner of RUMBLE and the global leader in cloud-delivered wireless edge solutions for branch, mobile, and IoT networks. RUMBLE improves business performance by designing and deploying custom end-to-end IoT solutions with ROI built-in. To discuss your specific smart city challenge, click Ready to Rumble.

In my last blog post (The Data Map—The Road to Managing Data as an Asset), I provided advice on how an organization can go about building a data map in order to establish a baseline of the data assets underlying their enterprise. Now, before we look at organizing those assets for analytic purposes, let’s talk about a different challenge that is foundational to everything else—becoming data driven. You probably hear this phrase a lot but may not necessarily see a lot of useful discussion about what it means, in practical terms. 

Let me start with my own personal core belief about this. Becoming data driven has a lot LESS to do with software, tools, BI, analytic techniques and wonder weapons, and a lot MORE to do with your organization’s culture, leadership, and philosophy about decision making and decision support.

Things that do not make your company data driven:

  • Buying an expensive business intelligence tool and the necessary usurious licenses and installing it on a multitude of desktops.
  • Hiring data scientists and creating CDO titles.
  • Creating a data mart, warehouse, or lake and accumulating metric tons of data.

Most of these things may have a place in a company’s evolution toward being data driven, and they may be necessary (at some point), but they are definitively not sufficient (at any point) to drive the transformation. Put another way: in my years as an analytics consultant, I have worked with many smaller companies whose toolkits were basically just Excel, yet they were far, far more data driven than giant enterprises that had spent millions on the items listed above while failing to address the core changes necessary to actually use them.

In the absence of leadership making the difficult changes to their operational processes, companies don’t fully utilize the capabilities these toolkits deliver in a data driven manner. You fundamentally have to a) trust the data and b) be willing to have the courage of your convictions to drive the outcomes the analytics illuminate. Those convictions are often torpedoed by leadership-centric issues of politics, expediency, procrastination, or cults of personality. I have worked for companies where that list constitutes the entire operational methodology. We laugh, but everybody reading this knows it’s true, and probably sees some of it at their own company every day. If you’re a leader, and your reaction is “not at my company,” well, good luck!

Let’s not kid ourselves. These are very common problems, to a greater or lesser degree, at many companies. These kinds of organizational behaviors and dysfunctions are the biggest barrier to becoming data driven, not a lack of shiny tools and cool software.

So, this raises the question: what is the CEO, CMO or COO who is truly committed to making this happen supposed to do?

At its very core, becoming data driven means being fact-driven. It means making more efficient, informed decisions. It means having what fighter pilots call maximum situational awareness—striving for near perfect clarity on the state of your operation, and relentlessly seeking highly informed insight into what is likely to come in the near, intermediate, and longer-term future. It means embracing measurement and celebrating the results—both good and bad. Those qualifiers, by the way, are probably holding you back right now. What you want, as a manager, is accuracy—and if you want to get your people in the habit of thinking that way, you should be substituting “accurate” and “inaccurate” as your key descriptors for your numbers. Don’t punish people, at all costs, for bringing you numbers or analysis that you don’t like—if it is accurate. Reward honesty in measurement, regardless of the relative interpretation.

If you are committed to reaching this goal of becoming data driven, then you are likely going to take your company on a journey through the three levels of analytics:

Descriptive Analytics: Focused on maximizing the utility of the datasets generated by your current operation, supplemented with other data sources, to increase efficiency.

Predictive Analytics: Deploying tools that allow your operation to anticipate customer needs and to model forecasts and scenarios of possible business outcomes (product launches, for example).

Prescriptive Analytics: Currently much debated in definition, but grounded in the implementation of advanced AI and machine learning techniques to address complex, multi-variate questions. Characterized by a state of maximum automation, it can be thought of as the point where smart machines begin to manage much of the operational decision making in an enterprise.

So, how to begin that journey?

Issue an RFP for an advanced BI tool, right? WRONG! You have homework to do, my friend. Developing the roadmap that will eventually guide you through this journey means coming back to the core of what being data driven means: understanding the current state of your operation and the priorities you need to achieve.

  1. Drive revenue and growth
  2. Reduce expense and grow margin
  3. Increase the efficiency of the operation (in many cases, cost avoidance rather than cost reduction)

To make these things happen in a data-driven way, look at the barriers blocking progress toward these strategic goals. I advise companies to start with a very straightforward exercise: the Business Threat Assessment.

Becoming Data Driven: Step 1—The Business Threat Assessment

This is the foundational step to all that follows. It establishes the priorities that analytic solutions need to address, and it’s entirely in the control of the company to achieve. The company needs to answer three fundamental questions:

  1. What are the three greatest tactical (next 1 to 2-year horizon) threats to the operation’s success?
  2. What are the three greatest strategic (next 3 to 5-year horizon) threats to the operation?
  3. What are the three greatest, persistent operational issues the company seems to face, year after year?

Some words of advice about this: If, upon reading this list, your first impulse is to reach for the phone and call a high-powered (expensive) business consultant to come in and execute this, you’re already off the rails. This is an exercise that any company should be able to accomplish without any external help, and if you can’t, you have bigger problems than analytics can fix. Get the bright leaders in your company to spend a day on this. And, if your feeling is “I can’t trust this to be done right,” then do pick up the phone, call a recruiter and get on top of your real issue.

Let’s be honest. Anybody in a leadership position should have a pretty good idea of the answer to these questions. If you’re not talking about them today, in a regular fashion, then your first step on the way to becoming data driven is to institutionalize this list, refresh it on a monthly basis, and focus the leadership team on addressing it.
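
For teams that want to institutionalize the list in something more durable than a slide deck, here is a minimal sketch in Python of one way the BTA outputs could be captured and checked against a monthly refresh cadence. The fields, example entries, and 31-day threshold are hypothetical illustrations, not a prescribed format.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class BusinessThreatAssessment:
    """The three BTA lists, owned by a leadership team and refreshed monthly."""
    owner: str
    last_refreshed: date
    tactical_threats: List[str] = field(default_factory=list)     # 1- to 2-year horizon
    strategic_threats: List[str] = field(default_factory=list)    # 3- to 5-year horizon
    persistent_issues: List[str] = field(default_factory=list)    # year-after-year drags

    def is_stale(self, today: date, max_age_days: int = 31) -> bool:
        """True if the list has not been refreshed within the monthly cadence."""
        return (today - self.last_refreshed).days > max_age_days

# Hypothetical example for a leadership team.
bta = BusinessThreatAssessment(
    owner="Executive team",
    last_refreshed=date(2019, 3, 1),
    tactical_threats=["New low-cost competitor", "Key account concentration", "Hiring gap in ops"],
    strategic_threats=["Channel disruption", "Regulatory change", "Core platform end-of-life"],
    persistent_issues=["Month-end close overruns", "Report sprawl", "Manual data re-keying"],
)

print(bta.is_stale(today=date(2019, 5, 1)))   # True -> time for the monthly refresh
```

The format matters far less than the discipline: the same structure can be scaled down so each manager maintains a version for their own area of responsibility.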

Why is this first step necessary?

Because being data driven means committing to a process of continuous improvement. As leaders and managers, you are prioritizing your people, their assets, and their efforts. If you’re not clear on the size, pressure and importance of the challenges facing the enterprise, you’re not able to task anybody effectively. You should be developing plans that deliver the maximum return to the business along the three key metrics we’ve discussed (growth/cost/efficiency), and those plans are NOT one and done. They are a continuous, reinforced, optimized set of decisions that are consciously selected to deliver maximum, measurable return.

TO BE CONTINUED…

In our next post, we’ll talk about how to take the outputs of the BTA, and do the exercise of asking the question, “How can my current data assets and KPIs help me address these challenges?” We’ll be in the land of descriptive analytics and talk about taking a hard look at your current reporting infrastructure before you spend a dollar to change it.

 

(Article by Chris Schultz, a principal at Analytic Marketing Innovations (AMI) and a RUMBLE Strategic Partner. Their solutions delivery approach identifies executable steps and recommends both near-term and long-term courses of action, helping your business leverage data insights for growth and transformation.)

“IoT should stand for the Insurance of Things because it helps companies avoid disaster, whether it’s predicting when a piece of machinery is about to fail or helping farmers better track the health of their livestock to prevent disease outbreaks.” – Perry Lea, RUMBLE co-founder

 

The Kansas City Business Journal talked with RUMBLE co-founders Terri Foudray and Perry Lea recently on the challenges aspiring businesses face in implementing IoT technologies (OP Tech Firm Talks IoT Business Challenges, Tips for Overcoming Them, staff writer Leslie Collins, KCBJ, February 19, 2019).

Terri and Perry provided four tips for businesses to help overcome these challenges:

  1. Identify the Company’s Technology Strengths and Weaknesses – Developing and implementing a scalable IoT solution is complex, and most companies need to supplement their internal team with outside experts to create a well-rounded solution.
  2. Prioritize What is Connected – Companies should adopt IoT solutions that add value and not just connect something because it can be connected. “Instead of ‘biting the whole apple’, start with the low-hanging fruit that will have an immediate impact,”  said Terri Foudray, RUMBLE CEO.
  3. Be Prepared for Cultural Change – Implementing IoT solutions requires previously siloed teams within a company, like IT and OT departments, to work together. “If you don’t have leadership of the company behind the initiative, it’s likely not going to be effective,” said Foudray.
  4. Find Your Team Quarterback – Partner with an experienced company to help you lead the effort and remove complexities. “…look for people who have actually deployed IoT solutions, and you will save yourself a lot of headache and wasted time and resources,” added Foudray.

For more insights on the importance of developing a relationship with an end-to-end IoT solutions provider, visit RUMBLE.

“The ability to interconnect things, services and people via the Internet improves data analysis, increases productivity, enhances reliability, saves energy and costs, and generates new revenue opportunities through innovative business models.” – A. Gosine, author.

An article by author A. Gosine featured in IIoT World (Water/Wastewater Utilities Leveraging IIoT) provides an interesting review of how connected machines and devices are reshaping the way utilities operate, allowing them to make smarter decisions.

Gosine lists seven ways IIoT is already positively affecting water management at water and wastewater plants:

  1. Water Leakage Protection. IIoT technologies increase energy efficiency by correlating energy patterns to production and process variables. The result is new, dynamic data analytics and real-time system monitoring that can more quickly detect water loss through leaks or theft and track water pressure fluctuations (a minimal sketch of this kind of monitoring follows the list).
  2. Systemic Water Management Efficiencies. Less expensive water sensors now track water quality, temperature, pressure, consumption and more, making it easier to understand how consumption compares to city averages, industry averages and previous months. The data generated by the sensors is communicated to the utility in easy-to-understand formats, ready for sharing with consumers if so desired.
  3. Water Quality and Safety Monitoring. A smart water quality management approach monitors almost every measurable parameter – chlorophyll, air temperature, turbidity, dissolved oxygen concentration, oxidation-reduction potential, pH, relative humidity, chlorine concentrations, the presence of organic compounds, and electrical conductivity – in real time and acts on the readings to certify drinking-water quality.
  4. Wholesale/Retail Water Consumption Transparency. Utilities can use IIoT to respond to fluctuations in energy cost or new compliance requirements. It also enables new performance modeling capabilities in support of energy budgeting and contract negotiations.
  5. Infrastructure Prescriptive Maintenance. Predictive and preventive maintenance and anomaly detection are dramatically enhanced through the implementation of an IIoT platform. One example discussed is the monitoring of pump system performance. Many pumps typically operate below their best efficiency point (BEP), with the excess energy transmitted into vibration, heat and noise, all of which increase maintenance and energy costs.
  6. Industry Consolidation and Best Practices. IIoT technologies are revolutionizing industry best practices. Sharing data between industrial businesses uncovers synergies within product portfolios, redefining what is possible in energy savings, increased equipment lifetime and reduced maintenance costs.
  7. Manufacturing Innovation Incentives. IIoT also spurs manufacturers to “rethink” their role in supporting water utilities by investigating and developing comprehensive new solutions like “smart pumping systems.”
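
To make the monitoring idea in the first item above more concrete, here is a minimal sketch in Python of one common leak indicator: minimum night-time flow creeping above its recent baseline. The meter readings, the seven-day baseline window, and the 25% alert ratio are hypothetical and are not drawn from Gosine’s article or any particular IIoT platform.

```python
from statistics import mean

# Hypothetical minimum night-time flow (liters/second) from one district meter, by day.
# Night-time minimum flow is a common leak indicator: demand is low overnight, so a
# persistent rise usually means water is leaving the network somewhere.
night_min_flow = [4.1, 4.0, 4.3, 4.2, 4.1, 4.0, 4.2,   # a normal week
                  5.9, 6.1, 6.0]                        # step change: possible leak

BASELINE_DAYS = 7      # rolling window treated as "normal"
ALERT_RATIO = 1.25     # flag if tonight's minimum exceeds the baseline by 25%

for day in range(BASELINE_DAYS, len(night_min_flow)):
    baseline = mean(night_min_flow[day - BASELINE_DAYS:day])
    tonight = night_min_flow[day]
    if tonight > ALERT_RATIO * baseline:
        print(f"Day {day}: night minimum {tonight:.1f} L/s vs baseline {baseline:.1f} L/s -> investigate")
```

In practice an IIoT platform would apply this kind of check continuously across every district meter and route alerts to field crews, but the underlying logic is no more complicated than this.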

RUMBLE exists to help water and wastewater utilities design custom architecture solutions to deliver on the potential of IIoT by working closely with operational and informational technology personnel. Our approach is simple. We start with your business goal – greater efficiency, scalability, predictability, automation or engagement. Then we leverage our unique experience and technical insights to build a plan that not only delivers on your vision but is measurable, secure and delivers an ROI. Let’s start with a conversation.