In my previous post (The First Level of Analytics for the Data Driven Company), we defined the basic building blocks of descriptive analytics and introduced the first of three applied descriptive analytic examples to help drive value for your company and make your decision making better informed and more data driven.

The goal of increasing efficiency in a systematic manner is, fundamentally, a commitment to the philosophy and process of continuous improvement. Measure, analyze, respond, act. Repeat.

The first applied descriptive analytic we reviewed last post was The Reports Audit and Data Model Creation or Optimization where we focused on how “report creep” can saddle your organization with a costly drag on employee productivity. By establishing a data model that classifies and organizes your data elements, you can realize huge gains in FTE hours saved in your organization’s reporting infrastructure.

Here are two additional applied descriptive analytic examples for your review:

Applied Descriptive Analytic #2: Key Performance Indicator (KPI) Review and Testing

Real-life Example: We recently helped a company that provided social services to various states, primarily placing at-risk kids into home solutions (adoption, foster, family, etc.). We were discussing how analytics could help increase their efficiency and one of the clients voiced a particular problem they faced.

Periodically, they had to provide the states they contracted with a summary of where the kids were in the system at a specific point in time (i.e., just entering, placed in care, receiving counseling or other services, or exiting the system). We were astounded to hear that it took two to three weeks to assemble the information needed to populate this report. Since the company depended on these state contracts, the report qualified as a KPI and was both essential and strategic to their future growth.

Think about this example: the company produced hundreds of reports for internal and external audiences but had to manually gather data to answer what many of us in the outside world would consider a reasonable, even common, information request. These folks were simply not tracking the right KPIs.

Your organization’s KPIs should be testable – you should be able to demonstrate that they are predictive of the outcomes you’re trying to achieve. One common problem we see is KPIs created from the bottom up – a given metric becomes the de facto selection simply because it already appears on a report. Useful KPIs should be created from the top down. The executive and leadership team should be responsible for determining the strategic definitions of success, and the metrics defining those conditions should then be selected or created. Again, having a defined data model gives you a leg up, as the analytics team can test whether the metrics are statistically verifiable against the outcomes you’ve defined as positive.
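One simple way to test whether a candidate KPI is predictive is to check how strongly it correlates with the outcome it is supposed to drive across reporting periods. The sketch below is a minimal, hypothetical illustration – the data and metric names are invented, and a real validation would use larger samples and a proper significance test – but it shows the basic idea of verifying a KPI against an outcome rather than accepting it because it happens to appear on a report.

```python
import math

# Hypothetical monthly data: a candidate KPI (e.g., placements processed)
# and the outcome it is supposed to predict (e.g., contracts renewed).
kpi_values = [12, 15, 11, 18, 20, 17, 22, 25]
outcomes = [3, 4, 3, 5, 6, 5, 7, 8]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

r = pearson_r(kpi_values, outcomes)
print(f"correlation with outcome: {r:.2f}")
```

A metric that correlates strongly with the defined outcome is at least a plausible KPI candidate; one that doesn’t is probably a bottom-up metric worth retiring.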

One last comment: KPIs are never fixed. Like everything we’re discussing, they are subject to a process of continuous improvement. We’ve worked with companies that review their KPIs quarterly and ask: what has changed in the make-up of the company since the last financial period? Do the critical KPIs still reflect the desired strategic goals? If you’re not asking these questions, your KPIs will drift over time, and you’ll find yourself wasting time, money and hours monitoring the wrong metrics.

Applied Descriptive Analytic #3: Attribution Analysis

Real-life Example: We worked with a client in Hong Kong that was spending millions of dollars on multiple on-line and off-line channels to sell a product. They wanted to know if they were allocating their funds to optimize their growth. We worked with their digital marketing team to examine the reporting streams associated with each of the digital channels and built an attribution model to measure contribution.
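To make the idea of measuring contribution concrete, here is a minimal sketch of linear attribution, one of the simplest attribution models, in which each conversion’s credit is split equally across the channels a customer touched on the way to purchase. The channel names and paths are hypothetical, and a production model (like the one described here) would draw on each channel’s actual reporting stream.

```python
from collections import defaultdict

# Hypothetical conversion paths: the ordered channel touches
# a customer made before converting.
paths = [
    ["social", "search", "direct"],
    ["social", "social", "direct"],
    ["email", "search"],
    ["social"],
]

def linear_attribution(paths):
    """Split each conversion's single unit of credit equally
    across all touchpoints in its path."""
    credit = defaultdict(float)
    for path in paths:
        share = 1.0 / len(path)
        for channel in path:
            credit[channel] += share
    return dict(credit)

print(linear_attribution(paths))
```

Comparing each channel’s attributed credit against its cost is what lets you ask whether spend is allocated to optimize growth, which is exactly the question this client posed.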

What we found in the data was interesting. The digital team was seeing a much higher number of arrivals to their website from several social media channels compared to more direct digital marketing channels. The contribution of social to conversion was also higher than the direct channel options, even though the latter cost more. But why?

By examining the specific content in the digital channels using various listening platforms, we discovered a fascinating trend: the social chatter revolved around people discussing a series of advertisements the institution had placed on billboards affixed to city buses. The selection of male and female models and the clothes and accessories they were wearing had caught people’s attention, which translated into sharing, more pull-through visits to the website and sales of the product.

In this example, the attribution model not only showed that the cheapest digital channel was contributing the most conversions; further analysis uncovered that the root cause of the increased conversions was an inexpensive series of traditional ads. And yes, thanks to the power of analytic attribution modeling, the featured male and female models got more work! The client benefited as well, increasing their marketing efficiency by realigning their spend to take advantage of this information.

These are just a couple more examples of how applied descriptive analytics can help drive value for your company and make your decision-making better informed and more data-driven. In our next blog series, we’ll shift gears and look into the realm of Predictive Analytics and how data builds on data as companies increase their depth and maturity in applying analytics to their business.

(Article by Chris Schultz, a principal at Analytic Marketing Innovations (AMI) and a RUMBLE Strategic Partner. Their solutions delivery approach identifies executable steps and recommends both near-term and long-term courses of actions, helping your business leverage data insights for growth and transformation.)

RUMBLE strategic partner, Cradlepoint, recently sponsored a white paper (A Sensible Approach to Smart City Projects) showing the most successful smart city initiatives have been those that started small.

“Some companies will come and try to create a global dashboard that tries to take data from every part of a city and analyze it,” said Ken Hosac, vice president of IoT strategy and business development, Cradlepoint. “The approach we’ve been taking is to find a specific targeted use case and make that use case better, or enable a use case that didn’t exist before.”

This more “sensible approach” of refining one service at a time, instead of undertaking a massive project upfront, has overcome the hesitancy of city managers to take advantage of smart city technologies.

The White Paper explores various facets of smart city initiatives, including:

  • Examples of smart city technologies
  • The potential benefits of smart cities in enhancing the quality of life for their citizens
  • Common and pragmatic first steps in embarking upon smart city initiatives
  • Smart city pitfalls to avoid

Cradlepoint is a strategic partner of RUMBLE and the global leader in cloud-delivered wireless edge solutions for branch, mobile, and IoT networks. RUMBLE improves business performance by designing and deploying custom end-to-end IoT solutions with ROI built-in. To discuss your specific smart city challenge, click Ready to Rumble.