Analytics is changing. How are you keeping pace?


SOURCE: CIO.COM
NOV 14, 2022

Analytics has evolved dramatically over the past several years as organizations strive to unleash the power of data to benefit the business. While many organizations still struggle to get started, the most innovative are using modern analytics to improve business outcomes, deliver personalized experiences, monetize data as an asset, and prepare for the unexpected.

Modern analytics is about scaling analytics capabilities with the aid of machine learning to take advantage of the mountains of data fueling today’s businesses, and about delivering real-time information and insights to the people across the organization who need them. To meet the challenges and opportunities of the changing analytics landscape, technology leaders need a data strategy that addresses four critical needs:

  • Deliver advanced analytics and machine learning that can scale and adapt to whatever evolutions in applications and data science the future may hold.
  • Break down internal data silos to create boundaryless innovation while enabling greater collaboration with partners outside the organization.
  • Embrace the democratization of data with low-code/no-code technologies that offer the insight and power of analytics to anyone in the organization.
  • Embed analytics into business processes to create more compelling, relevant customer experiences and insights in real time.

Building a foundation for flexible and scalable analytics

Migrating analytics from on-premises systems to the cloud opens a realm of applications and capabilities and has allowed organizations to gradually shed the restraints of legacy architecture, with the proper controls in place.

“The migration of advanced analytics to the cloud has been an iterative, evolving process,” said Deirdre Toner, Go-To-Market leader for AWS’s analytics portfolio of services. AWS doesn’t recommend that organizations try to completely re-create their on-premises environments in the cloud. “Migration works best by considering the guardrails and processes needed to collect data, store it with the appropriate security and governance models, and then accelerate innovation,” Toner said. “Don’t just lift and shift with the old design principles that caused today’s bottlenecks. This is an opportunity to modernize and break down old architectural patterns that no longer serve the business.”

The goal is a data platform that can evolve and scale almost infinitely, using an iterative approach to maintain flexibility, with guardrails in place. “IT leaders want to avoid having to redo the architecture every couple of years to keep pace with changing market requirements,” said Toner. “As use cases change, or if unforeseen changes in market conditions suddenly emerge – and they surely did during the pandemic – organizations need to be able to respond quickly. Being locked into a data architecture that can’t evolve isn’t acceptable.”

Aurora – a company transforming the future of transportation by building self-driving technology for trucks and other vehicles – took advantage of the scalability of cloud-based analytics in developing its autonomous driver technology. Aurora built a cloud testing environment on AWS to better understand the safety of its technology by seeing how it would react to scenarios too dangerous or rare to test in the real world. With AWS, Aurora can run 5 million simulations per day, the virtual equivalent of 6 billion miles of road testing. Aurora combined its proprietary technology with many AWS database, analytics, and machine learning services, including Amazon EMR, Amazon DynamoDB, AWS Glue, and Amazon SageMaker. These services helped Aurora reach levels of scale not possible in a real-world testing environment, accelerating its pace of innovation.
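To make that concrete, here is a minimal, hypothetical sketch (not Aurora’s actual pipeline) of how a nightly batch of simulation output might be processed with two of the services named above, using Python and boto3. The cluster ID, bucket, script path, and table name are all placeholders.

    import boto3

    emr = boto3.client("emr", region_name="us-west-2")
    dynamodb = boto3.resource("dynamodb", region_name="us-west-2")

    # Submit a Spark step to a running EMR cluster to aggregate one
    # day's simulation output (all identifiers are placeholders).
    response = emr.add_job_flow_steps(
        JobFlowId="j-EXAMPLECLUSTER",
        Steps=[{
            "Name": "aggregate-simulation-results",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": [
                    "spark-submit",
                    "s3://example-bucket/jobs/aggregate_results.py",
                    "--input", "s3://example-bucket/simulations/2022-11-14/",
                    "--output", "s3://example-bucket/aggregates/2022-11-14/",
                ],
            },
        }],
    )

    # Record the batch in DynamoDB so downstream jobs can track it.
    table = dynamodb.Table("simulation-runs")
    table.put_item(Item={
        "run_date": "2022-11-14",
        "emr_step_id": response["StepIds"][0],
        "status": "SUBMITTED",
    })

The point of the pattern is elasticity: the same step-submission call works whether the cluster is processing one batch or thousands in parallel.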

Moving beyond silos to “borderless” data

Integrating internal and external data and achieving a “borderless” state for sharing information is a persistent problem for many companies that want to make better use of all the data they collect or can access in shared environments. Toner emphasized the importance of breaking down data silos to become truly data driven.

Organizations also need to explore new ways to harness third-party data from partners or customers, which increases the need for comprehensive governance policies to protect that data. Solutions such as data clean rooms are becoming more popular as a compliant, secure way to leverage data from outside providers or to monetize proprietary data sets.

AWS Data Exchange makes it easy for customers to find, subscribe to, and use third-party data from a wide range of sources, Toner said. For example, one financial services customer needed a better way to quickly find, procure, ingest, and process data provided by hundreds of vendors. But its existing data integration and analysis process took too long and used too many resources, putting at risk the bank’s reputation for providing expert insights to investors in fast-changing markets.

The company used AWS Data Exchange to streamline its consumption of third-party data, enabling teams across the company to build applications and analyze data more efficiently. AWS Data Exchange helped the firm eliminate the undifferentiated heavy lifting of ingesting and preparing third-party data, freeing developers to spend more time generating insights for clients.
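To illustrate the mechanics, the sketch below shows one way that consumption pattern can look against the AWS Data Exchange API via boto3: list the data sets the account is entitled to, then export one revision’s assets to Amazon S3 for downstream processing. The bucket name is a placeholder, and error handling is omitted.

    import boto3

    dx = boto3.client("dataexchange", region_name="us-east-1")

    # Find the data sets this account is entitled to through its
    # subscriptions.
    data_sets = dx.list_data_sets(Origin="ENTITLED")["DataSets"]
    data_set_id = data_sets[0]["Id"]  # pick one for the example

    # Take the first revision returned; production code would select
    # by CreatedAt rather than assume an ordering.
    revisions = dx.list_data_set_revisions(DataSetId=data_set_id)["Revisions"]
    revision_id = revisions[0]["Id"]

    # Export the revision's assets to S3 (placeholder bucket name).
    job = dx.create_job(
        Type="EXPORT_REVISIONS_TO_S3",
        Details={"ExportRevisionsToS3": {
            "DataSetId": data_set_id,
            "RevisionDestinations": [{
                "Bucket": "example-third-party-data",
                "RevisionId": revision_id,
            }],
        }},
    )
    dx.start_job(JobId=job["Id"])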

Making analytics accessible to the masses

The consumerization of data and the broad applicability of machine learning have led to the emergence of low-code/no-code tools that make advanced analytics accessible to non-technical users.

“The simplification of tools is a crucial aspect of changing how a user prepares their data, picks the best model, and performs predictions without writing a single line of code,” said Toner. Amazon SageMaker Canvas and Amazon QuickSight are two examples of the low-code/no-code movement in machine learning and analytics, respectively.

SageMaker Canvas offers a simple drag-and-click user interface that lets a non-technical person build an entire machine learning workflow without writing code. QuickSight Q, powered by machine learning, lets any user ask questions in natural language and get answers in real time.
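For teams that want that natural-language experience inside their own applications, QuickSight also exposes an embedding API. The sketch below is a minimal illustration rather than a full integration: it generates a short-lived URL for the Q search bar via boto3, with placeholder account, user, and topic IDs.

    import boto3

    quicksight = boto3.client("quicksight", region_name="us-east-1")

    # Generate a short-lived URL that embeds the Q search bar in an
    # internal web app (all IDs below are placeholders).
    response = quicksight.generate_embed_url_for_registered_user(
        AwsAccountId="111122223333",
        UserArn=("arn:aws:quicksight:us-east-1:111122223333:"
                 "user/default/analyst"),
        SessionLifetimeInMinutes=60,
        ExperienceConfiguration={
            "QSearchBar": {"InitialTopicId": "example-topic-id"}
        },
    )
    print(response["EmbedUrl"])  # hand this URL to the front end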

Embedding insights and experiences

Toner emphasized the importance of understanding that the types of people who need access to data across the business are expanding. “You can’t just build an analytics environment that serves a handful of developers and data scientists,” she said. “You need to make sure that the people who need data for decision making can find it, access it, and interpret that data in the moment it is important to them and the business.”

A cloud-based data strategy makes it possible to embed the power of data directly into customer experiences and workflows by making relevant data available as it’s needed. Toner pointed to Best Western, the hotel and hospitality brand, which uses real-time analytics to give its revenue management team the ability to set room rates at any given moment. The result: revenue gains and greater responsiveness to customers.

“Best Western used to rely on static reports and limited data sets to set room rates,” Toner said. “Now, with QuickSight, they can access a much broader set of data in real time to get the insights they need to make better decisions and improve the efficiency of every team member.”
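The mechanics behind this kind of embedded, real-time experience often look like the following hypothetical sketch, which is not Best Western’s actual implementation: it uses the QuickSight embedding API to generate a URL for a single dashboard, scoped to anonymous viewers, that an operational tool could render in an iframe. Account and dashboard IDs are placeholders.

    import boto3

    quicksight = boto3.client("quicksight", region_name="us-east-1")

    dashboard_arn = ("arn:aws:quicksight:us-east-1:111122223333:"
                     "dashboard/room-rate-dashboard")  # placeholder

    # Generate an embed URL scoped to one dashboard, suitable for
    # surfacing live analytics inside an operational tool.
    response = quicksight.generate_embed_url_for_anonymous_user(
        AwsAccountId="111122223333",
        Namespace="default",
        AuthorizedResourceArns=[dashboard_arn],
        SessionLifetimeInMinutes=30,
        ExperienceConfiguration={
            "Dashboard": {"InitialDashboardId": "room-rate-dashboard"}
        },
    )
    print(response["EmbedUrl"])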

Addressing these four core components of modern analytics will help CIOs, CDOs, and their teams develop and deploy a data strategy that delivers value across the business today, while being flexible enough to adapt to whatever may happen tomorrow.

Learn more about ways to put your data to work on the most scalable, trusted, and secure cloud.
