Data needs to flow to be of any use. That flow needs to pervade the often complex infrastructures through which data is processed and delivered and mirror the rhythm of activities within which the data is being originated. Data flow also needs to keep pace with the cadence under which data is being consumed in actual applications. If the flow is interrupted, data’s value diminishes rapidly.
Flow lives in the moment, but the moment must derive its momentum from an ongoing stream of data-powered decisions. Every data steward knows that the quality of data comes in part from its latency—the speed at which data flows from the source and, as a consequence, the likelihood that it represents the latest, most accurate picture of some state of affairs. But data’s quality also flows from the extent to which the latest feed updates and extends a deep pool of high-quality historical data that has been maintained in data warehouses, operational data marts, and other systems of reference.
In IBM Data magazine the week of March 2, 2015, three new articles examine how a steady flow of reliable data brings value to diverse usage scenarios. Facilitating the steady rush of quality data means everything in a world in which critical decisions are made in the moment, in real time, day and night, without interruption.
Mobility is driving the new world of real-time streaming data flows. First-time contributor Erik Burckart beautifully articulates the value that sub-second data flows, contextualized data, and elegant design bring to the usage of mobile applications. Citing a Nielsen study, Burckart briefly summarizes the impact of a mobile moment: “A wait time of a tenth of a second gives the feeling of instantaneous response—that is, outcomes feel as though they were caused by the device users, not their devices. A one-second response helps them maintain a seamless train of thought. They can sense a delay, but they still feel in control of the overall experience. By three seconds, device users begin to feel frustrated, and at five seconds, they are unhappy.”
Data warehouses are the central repository of master data for many applications, including those that drive mobile and other in-the-moment usage scenarios. To provide continuing value in a fast-changing world, the data warehouse needs an adaptive, metadata-driven architecture that facilitates the flow of changes to the data model and other design features, according to David Birmingham. “This need to functionally morph in the data model–facing architectural core requires the greater resilience of adaptive architecture,” he states. “A failure in this point is why many data warehouses go stale or even incomprehensible over time. The warehouse’s data model requirements were in vogue when first captured, but six months later when deployed as the data warehouse, those requirements were already going stale with new ones arriving. If the data warehouse is strongly lashed to the original requirements, it could wax obsolete as the incoming requirements shift out from under it.”
The flow of engagement between human users and artificially intelligent systems is a core feature of cognitive computing systems such as the IBM® Watson™ technology. Defining a synergistic flow means everything when trying to improve the productivity of people who leverage cognitive computing to extend their abilities to discover new insights and use them to accelerate positive outcomes. Intelligent systems have long been expected to one day become capable of not just supplementing human cognition, but also—at least under certain circumstances—convincing humans that machines exhibit true intelligence. In my latest article, I muse on how the flow of a Turing test—that theoretical challenge that blurs end-user and programmatic interfaces to impersonate intelligence—might conceivably be mimicked far short of replicating the full power of the human mind.
Thanks for reading and engaging. And please check out our latest NewsBytes and upcoming events for opportunities to educate yourself on the power of data.
Editor in Chief, IBM Data magazine