What Matters Now: Data Analytics

by John Gibs, David Rossitter, Ken Allard, Marissa Gluck, Huge / May 14, 2014

The top trends making Big Data more than just marketing buzzwords.

People throw the term “Big Data” around frequently, but few can define it precisely. It typically refers to large-scale customer data sets that grow quickly and need advanced analytics tools to access and parse. Marketers need to better understand what questions can be asked and answered with Big Data or else it becomes just another fad.

Examples of those questions are concrete and diverse across an array of businesses and functions. In retail, Big Data facilitates not only better customer personalization but also data-driven pricing, more precise Customer Relationship Management (CRM) and more efficient ad spending. In manufacturing, factories monitor their physical equipment to precisely target the optimal time to repair or replace it, saving money.

Financial services companies use Big Data to detect fraud and develop finely tuned financial instruments and sophisticated offers. In law enforcement, police use it to predict potential crime hotspots based on historical data rather than gut instinct. Cities have started Big Data initiatives to optimize the flow of traffic, waste and power consumption.

In healthcare, hospitals are examining their patient records to identify those that are likely to need readmission after discharge in order to intervene early enough to prevent another expensive stay. Consumers use apps and wearable technology to quantify things like caloric intake, exercise and sleep to improve health.

Big Data has the potential to revolutionize businesses, improve decision-making and create innovative revenue models. The increasing velocity, volume and variety of data offer new opportunities to organizations but also present a challenge to businesses ill equipped to deal with it. Is Big Data just Big Hype? Here are the key trends shaping the evolution of data analytics and business intelligence today:

More and more companies unlock the value of information.

Businesses have traditionally viewed data as a byproduct of other processes, such as manufacturing, distribution or sales. Innovative organizations are quickly realizing that data itself is a key component of products. While there are opportunities to create new revenue streams from the glut of data created, few companies are taking advantage of them today.

According to Gartner, only 10 percent of businesses today are monetizing their information assets, though that number is expected to rise to 30 percent by 2016. More organizations will develop ventures that package and sell their publicly available data, launching new insight-based products. Organizations that don’t will risk losing potential revenue to startups and competitors.

One example of a company that created a new business model based entirely on publicly available data and a proprietary algorithm is the Climate Corporation, founded by former Google executives. The company, which sells farmers crop insurance, has gone well beyond the typical insurance model to offer hyperlocal weather forecasts for individual crop fields with projected growth stages, soil moisture tracking, and tools to optimize yields. Monsanto recently acquired the company for nearly $1 billion.

Key Takeaway: Along with the rise of third-party information resellers that can help companies execute their data monetization strategies, internal data product managers will emerge and lead efforts to create new revenue streams from existing data.

The first step in monetizing data is conducting an internal audit to determine the breadth and depth of in-house data and appraise its value. Companies need to understand the value of their data along two axes—how unique it is relative to the overall marketplace (and the company’s reach in that market) and the total number of customers the data applies to. If a company finds its data scores relatively high along the two axes, it is better positioned to create new data products.
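
To make the audit concrete, here is a minimal sketch of how such a two-axis appraisal might be scored. The 1-5 scales, the simple product score and the threshold are illustrative assumptions, not a standard methodology.

```python
# Minimal sketch of scoring in-house data assets along the two axes
# described above. The 1-5 scales and the threshold are illustrative
# assumptions, not a standard methodology.

def score_data_asset(uniqueness, reach):
    """uniqueness: 1-5, how rare the data is in the marketplace.
    reach: 1-5, how many customers the data applies to."""
    return uniqueness * reach  # simple product; high on both axes wins

assets = {
    "point-of-sale transactions": (4, 5),
    "warranty registrations":     (3, 2),
    "public store locations":     (1, 5),
}

for name, axes in sorted(assets.items(),
                         key=lambda kv: -score_data_asset(*kv[1])):
    score = score_data_asset(*axes)
    verdict = "candidate for a data product" if score >= 12 else "low priority"
    print(f"{name}: score {score} -> {verdict}")
```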

From describing and diagnosing to predicting and prescribing.

Beyond realizing that data is a monetizable asset, organizations are beginning to evolve their analytic processes and discovery. Companies have begun to shift from descriptive analytics (what happened) and diagnostic analytics (why it happened) to predictive analytics (what will happen).

The biggest stumbling block to predictive analytics today is the lack of integrated, customer-level data. A recent study by Acxiom and Digiday found that 74 percent of companies surveyed are unable to recognize customers in real time. As companies improve integration efforts over time, though, they will be able to do this kind of analysis. Organizations will not only achieve predictive capabilities, but will gain prescriptive abilities as well.

Some already have. Flash sales service Gilt, for example, has gone beyond the “personalized” recommendations many retailers offer as a sidebar with “Your Personal Sale.” The algorithm-powered page presents sale items based on the user’s purchase history, clothing size, favorite brands, browsing behavior and geographic location.
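
Gilt has not published how “Your Personal Sale” actually works; the sketch below simply illustrates the general idea of content-based ranking against a user profile. All features, weights and item data are hypothetical.

```python
# Hypothetical content-based ranking in the spirit of a "Your Personal
# Sale" page. Gilt's real algorithm is not public; the features and
# weights here are illustrative assumptions.

user = {
    "favorite_brands": {"Acme", "Northstar"},
    "size": "M",
    "recently_viewed_categories": {"outerwear"},
}

sale_items = [
    {"name": "Acme parka",      "brand": "Acme",      "size": "M", "category": "outerwear"},
    {"name": "Generic tee",     "brand": "BasicCo",   "size": "L", "category": "tops"},
    {"name": "Northstar boots", "brand": "Northstar", "size": "M", "category": "footwear"},
]

def affinity(item):
    score = 0.0
    if item["brand"] in user["favorite_brands"]:
        score += 2.0   # brand loyalty weighs heaviest
    if item["size"] == user["size"]:
        score += 1.0   # only surface items likely to fit
    if item["category"] in user["recently_viewed_categories"]:
        score += 1.5   # recent browsing signals intent
    return score

for item in sorted(sale_items, key=affinity, reverse=True):
    print(f'{item["name"]}: {affinity(item):.1f}')
```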

Key Takeaway: “How can we make it happen?” will become even more important to companies than being able to describe what happened or why. Efforts to improve data integration are key to achieving analytic maturity. Companies that are not already in the process of integrating their data towards this end are already at least six months behind the curve.

Small companies might not need Big Data: Right sizing is key.

For some organizations, a major investment in data infrastructure and talent may not be worth the incremental gains it can achieve. As all business becomes digital, every company needs analytics resources to some degree. Yet advanced analytics typically demands expensive, specialized software and vast computing power, along with a massive amount of data to analyze in the first place.

The challenge is figuring out what is required. Big organizations should have a centralized senior executive overseeing multiple teams to ensure collaboration and knowledge sharing. But smaller firms may not need that kind of complex, hierarchical structure. Nor will heavy spending on advanced analytics necessarily result in revenue gains. In addition, smaller businesses may already know how to anticipate future demand without investing heavily in expensive predictive modeling tools.

Such an investment is not just attractive but necessary for big organizations, however, due to the wealth of insights to be gleaned from advanced analytics. Research conducted last year by the Massachusetts Institute of Technology and McKinsey demonstrated that companies that incorporate advanced analytics into their operations outperform their competitors by five percent in productivity and six percent in profitability.

Key Takeaway: All businesses must understand basic interactions between users and their digital assets. Beyond that, organizations need to determine how much they are willing to invest in analytics resources based on their size, scale and mission. Even the smallest organizations should evaluate their data assets for potential value; sometimes the upside of analytics relates more to specific organizational goals than to company size.

Analytics tools become more user-friendly.

To date, analytics has remained a specialized function, partly because of the limitations of many analytics tools and platforms. Most require analysts to have specialized statistical skills, build their own mathematical models, and know what data they need. Additionally, most have weak visualization tools. The majority of marketers simply don’t possess these skills today.

And so businesses are demanding more user-friendly tools. Natural language queries, better data discovery tools, and intuitive visualizations will bring complex analytics out of the hands of the few. Computing power is no longer a barrier, as cloud-based systems allow organizations to process large volumes of data at scale. Easier-to-use data tools will allow more transparent assessments across an organization. Pervasive data and analytics tools also mean improved collaboration among users.

Data visualization tools will become more flexible too, structuring analysis in ways that are meaningful to different audiences. Just as retailers offer customers suggestions based on what similar users have bought in the past, analytics tools will offer guidelines based on the decision history of other users and their previous queries.

Key Takeaway: An increase in user-friendly analytics tools and technology can empower employees throughout an organization. Ensuring that Key Performance Indicators (KPIs) are not only established but also shared, combined with easier-to-use data tools, helps stakeholders across the organization work towards the same goals.

Better usability also shifts perceptions of analytics overall, helping it move away from being a tools-based system towards becoming an insights-based system. As more people within the organization gain access to data, the role of the analytics team shifts from mere reporting to providing deeper insights.

Data = Better UX.

Traditionally, user experience researchers have used qualitative tools, such as focus groups, customer interviews or iterative prototyping. Missing from these efforts is a quantitative approach that can lend insight into the customer’s pain points and obstacles.

A simple way to incorporate data into user experience is looking at Google search terms to see where there are gaps between common visitor queries and what the digital experience can actually deliver. Other tools include A/B testing (sometimes called “split testing”) and multivariate testing. A/B testing compares two versions of a single page element to gauge which one is more effective; multivariate testing does the same with multiple page elements simultaneously.
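
As a concrete illustration of the A/B case, here is a minimal sketch of judging whether version B beats version A using a two-proportion z-test, built only from the standard library. The visitor and conversion counts are made up.

```python
# Minimal sketch of evaluating an A/B test with a two-proportion
# z-test. The traffic and conversion counts are illustrative.
from math import erf, sqrt

def ab_test(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Version A: 200 conversions from 5,000 visitors; B: 260 from 5,000
p_a, p_b, z, p = ab_test(200, 5000, 260, 5000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z={z:.2f}  p={p:.4f}")
```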

Most of the tools available today help developers design for the average user. Yet the most beneficial tools are the ones that help optimize for the most valuable customers. For most organizations the 80/20 rule still applies: the top 20 percent of customers generate the bulk of the value. By focusing on the average user, existing analytics tools stay centered on the less valuable 80 percent, when they should be helping deliver great experiences to the more valuable 20 percent, as in the sketch below.
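
A minimal sketch of how an organization might identify that valuable minority: rank customers by revenue and find the smallest group that accounts for roughly 80 percent of it. The revenue figures are invented for illustration.

```python
# Sketch: identify the most valuable customers by cumulative revenue
# share, so experiences can be optimized for them rather than for the
# "average" user. Revenue figures are made up for illustration.

revenue_by_customer = {
    "c01": 9500, "c02": 7200, "c03": 610, "c04": 480, "c05": 450,
    "c06": 400, "c07": 330, "c08": 290, "c09": 220, "c10": 180,
}

total = sum(revenue_by_customer.values())
running, top_tier = 0.0, []
for cust, rev in sorted(revenue_by_customer.items(),
                        key=lambda kv: kv[1], reverse=True):
    if running / total >= 0.80:   # stop once ~80% of revenue is covered
        break
    running += rev
    top_tier.append(cust)

print(f"{len(top_tier)} of {len(revenue_by_customer)} customers "
      f"drive {running / total:.0%} of revenue: {top_tier}")
```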

Beyond the usability testing described above, smart organizations are also using more advanced sources of data such as website analytics, customer satisfaction data and call center feedback as potential inputs. Marrying more advanced sources of data with user-centered design results in a better experience, with more concrete insights and recommendations.

Key Takeaway: While qualitative insights have always played an important role in shaping user experiences, quantitative data will become increasingly critical to developing better insight into the customer’s journey, wants and needs.

More analytics talent emerges.

While analytics tools will improve and become more integrated into the daily decision-making of non-specialists, there remains a scarcity of business intelligence talent. This may be one of the most entrenched challenges to improving data analytics.

Currently, statisticians and modelers gravitate towards better paying jobs in industries such as adtech, government and enterprise technology (think IBM). Business schools are increasingly offering concentrations in analytics, training the next generation of business intelligence experts. Yet today marketers still struggle to fill analytics roles.

As data moves closer to the core of an organization’s infrastructure and mission, two main trends will drive an increase in available analytics talent. First, salaries will improve even if they don’t achieve parity with some other career paths. Additionally, the elevation of data within a business offers specialists other benefits, such as greater intellectual challenges. There will also be an upswing in “softer” cultural benefits: greater esteem, peer respect and greater latitude to influence the direction of an organization.

Key Takeaway: Organizations that move data closer to the core of their business will have a competitive advantage in attracting analytics talent. Even as more analysts enter the marketplace, companies need to offer not just competitive salaries but also the type of complex problem-solving opportunities that attract great talent.

Additionally, the kind of skills that organizations need will shift. While technical experts are important, companies should look for the kinds of skills that can migrate to business analytics from other roles, such as actuarial work. In the future, the growth of analytics teams will depend more on talent and experience than on technical skill.

Better information management practices begin to take hold.

The increase in available data means that organizations need to improve data governance. The first step towards strong data governance is ensuring every stakeholder uses a common language, so that terms mean the same thing across teams and platforms.

Data governance starts with empowering a single governing body to designate controlled vocabularies for specific data points. This gatekeeper controls access to the vocabularies and ensures all terms in them represent concepts that all parties can define and use in a consistent fashion. It also vets any subsequent changes prior to implementation.

Consider a company with multiple divisions, each with its own terminology for categorizing a marketing campaign. When the parent company begins to analyze marketing data at a corporate level, there can be serious consequences if the different terminologies are not aligned. Even worse, similar terms may be used across divisions to express different meanings. In a worst-case scenario, a corporate team may make decisions on marketing plans and budgets impacting millions of dollars based on inaccurate data, simply because of conflicting terminology.
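
As an illustration, a controlled vocabulary can be enforced in code at the point where data is recorded. The canonical channel terms and divisional synonyms below are hypothetical.

```python
# Minimal sketch of enforcing a controlled vocabulary at ingestion
# time. The canonical terms and divisional synonyms are hypothetical.

CANONICAL_CHANNELS = {"paid_search", "display", "email", "social"}

# Mapping from each division's local terminology to the shared terms,
# maintained by the governing body described above.
SYNONYMS = {
    "sem": "paid_search", "ppc": "paid_search",
    "banners": "display", "edm": "email",
}

def normalize_channel(raw):
    term = raw.strip().lower().replace(" ", "_")
    term = SYNONYMS.get(term, term)
    if term not in CANONICAL_CHANNELS:
        # Unknown terms are rejected rather than silently recorded,
        # forcing divisions to register them with the gatekeeper.
        raise ValueError(f"'{raw}' is not in the controlled vocabulary")
    return term

print(normalize_channel("PPC"))      # -> paid_search
print(normalize_channel("Banners"))  # -> display
```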

Organizations need to determine who within the company is responsible for governing these vocabularies and sharing accountability. Regular audits can help determine not just benchmarks for achieving data maturity but also assess vulnerabilities and potential risks.

Key Takeaway: Companies need to be proactive about creating a data governance framework and take into account their organizational structure, workflow, internal and external politics, and cultural factors.

Not all data is created equal.

In recent years, publishers have progressively tried to wrest power back from advertisers, particularly with the rise of automated media buying. The emergence of data brokers, ad exchanges and networks isn’t new, but the number of players has exploded in recent years. Yet even as the landscape looks increasingly atomized, brands have wisely opted to use their own data, or first-party data produced directly by publishers.

In an attempt to create unique offerings for marketers, publishers often layer third-party data over their own inventory to increase targeting capabilities. However, third-party data is increasingly commoditized, and it’s just as easy (and inexpensive) for a brand to buy that data independently.

Many ad sellers and browsers (including Safari, Firefox and Internet Explorer) have begun to push back against the use of third-party cookies, reducing the value of third-party data. Recently, Time Inc. and Condé Nast debuted advertising products based on first-party data that allow marketers to target consumers with more sophisticated audience segments both online and offline.

Key Takeaway: Organizations should focus on building broader as well as deeper first-party data stacks before investing in third-party resources.

When building their data architecture, businesses should think about each piece as part of a larger puzzle. There is high value to internal data but most organizations do not have all of the data they need. Instead, organizations should consider their overall goals and figure out which data assets they own internally, which exist within the larger organization and what needs to be acquired or built via techniques such as lookalike modeling to achieve their goals.
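
As a rough illustration of lookalike modeling, the sketch below ranks prospects by how closely their behavior matches an averaged profile of existing high-value customers. The features, numbers and scaling choice are assumptions; production systems use far richer inputs.

```python
# Rough sketch of lookalike modeling: rank prospects by distance to an
# averaged profile of existing high-value customers. The features and
# numbers are made up; production systems use far richer inputs.
from math import sqrt

# Feature vectors: [visits per month, avg order value, emails opened]
seed_customers = [[8, 120, 5], [10, 140, 7], [9, 110, 6]]
prospects = {"p1": [9, 125, 6], "p2": [1, 30, 0], "p3": [7, 90, 4]}

# Average the seeds into a single "ideal customer" profile.
profile = [sum(col) / len(col) for col in zip(*seed_customers)]

# Scale each feature by the profile value so no single feature
# (e.g. order value) dominates the distance.
def distance(vec):
    return sqrt(sum(((v - p) / p) ** 2 for v, p in zip(vec, profile)))

for pid, vec in sorted(prospects.items(), key=lambda kv: distance(kv[1])):
    print(f"{pid}: distance {distance(vec):.2f}  (smaller = more alike)")
```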

*With Tom O’Reilly, Director, Huge Content

