These days, there’s a constant stream of chatter about the importance of being a “real-time enterprise,” with the ability to sense and respond to any event or request instantaneously. Of course, there’s a technical definition of real-time, but to the business, being real-time goes beyond processing speed to the ability to simply act fast and respond at the moment a response is needed, such as when a customer needs a product upgrade.
So, what does it really mean to be real-time? I ran this question past industry experts, and here are some of their insights. The bottom line: joining the real-time revolution requires business sense, data savvy, attention to the edge, and adoption of modern technologies.
Real-time technology is critical to organizations going forward into the 2020s, because in today’s fast-paced world, real-time decision-making is a competitive differentiator. “Almost all modern applications require real-time capabilities,” says Dr. Vikram Ahmed, director of enterprise information systems at Stetson University. He points to the widespread adoption of mobile devices and systems, as well as the Internet of Things, all of which require real-time data to operate as they are designed and intended.
The question is, then, how ready are enterprises for this shift? As it stands now, “most organizations remain further down the real-time streaming analytics maturity curve,” says Steve Sparano, principal product manager of IoT and event stream processing at SAS. “Generating real-time insights requires the ability to ingest data in real-time, structure, analyze, and append that data to customer profiles, and then act on it accordingly. Core resources and technologies include streaming analytics to move from near-real time to processing data in real time, as events are happening, while still re-directing and storing the data in traditional databases for reporting, visualizations, and model development.”
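To make the pattern Sparano describes concrete, here is a minimal, purely illustrative Python sketch (the event fields, scoring rule, and storage are hypothetical, not SAS technology): each event is scored the moment it arrives, appended to a customer profile, and also redirected to a traditional database for later reporting and model development.

```python
import sqlite3
from datetime import datetime, timezone

# Toy "stream": in production this would be a message bus or event hub.
events = [
    {"customer_id": "c-101", "amount": 42.50, "channel": "web"},
    {"customer_id": "c-102", "amount": 980.00, "channel": "mobile"},
]

profiles = {}  # in-memory customer profiles, appended to as events arrive

# Traditional store for reporting, visualization, and model development
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (ts TEXT, customer_id TEXT, amount REAL, channel TEXT)")

def unusual(event, profile):
    """Placeholder real-time rule: flag a large jump over the customer's average spend."""
    avg = profile["total"] / profile["count"] if profile["count"] else 0.0
    return profile["count"] > 0 and event["amount"] > 3 * avg

for event in events:
    ts = datetime.now(timezone.utc).isoformat()
    profile = profiles.setdefault(event["customer_id"], {"total": 0.0, "count": 0})

    # 1. Act in the moment, as the event happens
    if unusual(event, profile):
        print(f"ALERT: unusual activity for {event['customer_id']}")

    # 2. Append the event to the customer profile
    profile["total"] += event["amount"]
    profile["count"] += 1

    # 3. Redirect the raw event to a traditional database for batch reporting
    db.execute("INSERT INTO events VALUES (?, ?, ?, ?)",
               (ts, event["customer_id"], event["amount"], event["channel"]))

db.commit()
```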
Technology itself “has not kept up with the explosion of data and the stress it can put on critical systems,” Joshua Odmark, CTO and co-founder of Pandio, points out. “Most technologies in existence today were built for big data analytics. Analytics only scratches the surface of what can be done with data, and as companies expand out from analytics, they require technologies that can handle more of everything. More data, more compute, more bandwidth, more labor, more connectivity, more operational support, or more insight.”
As a result, getting to a world full of real-time applications is still a work in progress, especially for mid-size and small companies that do not have the budgets and infrastructure in place, and are thus relying on “end-of-the-day processing” and “semi-automated data transfer protocols,” Ahmed says.
Of course, any technology wave is pushed and propelled by the applications that users need, and this is a key factor in the building momentum toward real time. “Real-time capabilities are in high demand in most analytical applications today,” says Odmark. At this time, “industries that are highly affected by time are driving innovation,” he says. “Machine learning applications are starting to explore real-time capabilities as they deal with inaccuracies with time series data and the gap between when a model can be trained and deployed.”
Sparano sees real-time applications arising across all industries and processes. “The emphasis is to deliver in-the-moment insights and decisioning, as the event happens, rather than days, or even weeks afterwards,” he says, citing examples of real-time in action: “Real-time analytics enable banks to detect fraudulent transactions and instantly determine the credit worthiness of online loan applicants. Manufacturers use IoT data and real-time analytics to detect and remedy failures and defects,” he says. “Machine learning helps retailers assess and influence buyer behavior with real-time offers, while visioning can help them enforce in-store social distancing. Marketers stream data in real-time from different channels and touchpoints and gather insights using machine learning and other predictive analytics techniques to guide consumers to conversion events and provide appropriate offers, messages, or content.”
Industry leaders outline at least four key steps to building a real-time enterprise:
Put business needs front and center: “Adopt the mindset that real-time is the future and create and alter both business and operational processes with a real-time-first attitude,” says Sparano. “Putting the technologies in place is the first step to building a real-time enterprise, but incorporating that technology into every business process is critical to fully building out the real-time enterprise.”
Get your data in order: Getting to real-time also requires “robust data management that supports both emerging streaming data and traditional data sources for real-time data integration and capabilities such as data cleansing, periodicity, and imputations,” says Sparano. Odmark advises establishing “a strong foundational element called a data fabric. This is a middleware layer that acts as the connectivity layer between all enterprise systems. In the past, this has been referred to as an enterprise service bus, but the data fabric layer has grown to encompass all middleware functionality in an enterprise. Most importantly, this solves challenging data accessibility issues when doing anything real-time.”
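To picture the connectivity layer Odmark describes, here is a deliberately tiny, hypothetical publish/subscribe bus in Python. A real data fabric spans many systems, protocols, and governance concerns, but the decoupling idea is the same: producers publish once, and every subscribed system reacts immediately.

```python
from collections import defaultdict
from typing import Callable

class MiniBus:
    """Toy stand-in for a data fabric / service bus: decouples the systems
    that publish data from the systems that consume it."""

    def __init__(self) -> None:
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(message)

bus = MiniBus()

# A CRM and an analytics service both tap the same order stream
bus.subscribe("orders", lambda m: print("CRM updated:", m["customer_id"]))
bus.subscribe("orders", lambda m: print("Analytics scored:", m["amount"]))

# The order system publishes once; every connected system reacts at once
bus.publish("orders", {"customer_id": "c-101", "amount": 42.50})
```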
Look to the edge: Getting to real-time also requires “implementation of real time analytics where the data originates and delivering analytics on the edge – where the sensors are, where the customers are in stores and online, and where fraudsters are perpetrating crimes at points of sale,” says Sparano. “This requires autonomous support to run analytics closer to the data source, without connectivity back to the cloud, thereby creating more flexible and powerful deployments.” Senthil Kumar, vice president of software engineering of FogHorn Systems, agrees, noting that the “key step to enabling a real-time enterprise is establishing a harmonious interplay between edge and cloud. With edge, organizations can ingest, enrich and analyze data locally, execute edgified machine learning models on cleaned data sets, and deliver enhanced predictive capabilities — versus cloud-heavy, expensive, retroactive insights.”
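A simplified sketch of that edge pattern, using made-up model coefficients and sensor names rather than any vendor’s product, might score readings locally, act on them without a cloud round trip, and hold the enriched records for a later sync:

```python
import math
import random

# Pretend these coefficients come from an "edgified" model trained centrally and
# pushed to the gateway; the values are invented purely for illustration.
MODEL = {"bias": -4.0, "temp_weight": 0.05, "vibration_weight": 1.5}

def failure_risk(temp_c: float, vibration_g: float) -> float:
    """Tiny logistic model executed on the edge device itself."""
    z = (MODEL["bias"]
         + MODEL["temp_weight"] * temp_c
         + MODEL["vibration_weight"] * vibration_g)
    return 1.0 / (1.0 + math.exp(-z))

pending_sync = []  # enriched readings held locally until connectivity returns

for _ in range(5):
    reading = {"temp_c": random.uniform(40, 95), "vibration_g": random.uniform(0.1, 3.0)}
    risk = failure_risk(reading["temp_c"], reading["vibration_g"])

    if risk > 0.8:
        # Act locally, in the moment, with no round trip to the cloud
        print(f"EDGE ALERT: failure risk {risk:.2f} -- triggering local shutdown")

    pending_sync.append({**reading, "risk": round(risk, 3)})

# When connectivity is available, forward only the cleaned, enriched records
print(f"{len(pending_sync)} enriched readings queued for cloud sync")
```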
Look to more flexible or emerging technologies: Cloud-based systems may help bring organizations along into real-time capabilities, Ahmed says, along with “some critical requirements for real time data transfers across multiple systems, including web services like REST and secure tunnels.” Getting to real-time insights also means “tapping into and ingesting streaming data from IoT data sources that weren’t typically accessible using traditional means — machine data like open platform communications (OPC), Kafka message queues, cameras, audio sensors,” says Sparano. This also incorporates “capabilities to display and visualize patterns in the data, as well as apply the data to previously built predictive models in the moment, not after the fact in batch mode. This includes typical high-frequency data as well as data at-rest from sensor-specific sources, customer behavior, and patterns in financial transactions.”
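As a rough example of that kind of ingestion, assuming the open source kafka-python client and an illustrative topic name, broker address, and scoring rule, a consumer can apply a previously built model to each message as it lands rather than in a later batch run:

```python
import json
from kafka import KafkaConsumer  # pip install kafka-python

def anomalous(reading: dict) -> bool:
    """Stand-in for a previously built predictive model."""
    return reading.get("vibration_g", 0.0) > 2.5

# Hypothetical topic and broker; in practice these come from your deployment
consumer = KafkaConsumer(
    "machine-telemetry",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:  # blocks, yielding records as they arrive
    reading = message.value
    if anomalous(reading):
        # Apply the model in the moment, not after the fact in batch mode
        print(f"Anomaly on {reading.get('machine_id')}: {reading}")
```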