
Digital transformation and the data disconnect: ‘A lot of people are still figuring it out’

Recent NI report identifies a performance gap between companies with “advanced” data strategies and those with more limited ones

Digital transformation as a concept has been floating around enterprise and industrial circles for some time. But it’s still not necessarily a given that companies are successfully pursuing it and have a coherent data strategy that is providing value for them.

A recent NI report makes clear that companies with advanced data strategies are seeing performance advantages over those who make more limited use of data and analytics. But “advanced” is a relative term, according to Mike Santori, fellow at NI, and often the company finds itself working with its customers to answer basic questions about value, and about how and what data to collect and connect.

“There’s a certain numbness that I think sometimes we all feel, about talking about digital transformation and analytics and everything, that I think really belies the reality that a lot of people are still figuring it out and trying to get value,” he says. “I think the original view of the problem was, well, just collect a lot of data and dump it someplace and you’ll magically throw a throbbing brain at it and it’ll just figure it out. That is far, far from the case.” Still, he says, the report confirmed NI’s anecdotal experience that its customers are highly interested in improving their strategic use of data.

“The overall level of interest has really, really increased in the last year or two,” Santori says.

The report is based on a survey of more than 300 “product innovators” that was conducted in May of this year and gathered information across 10 industries, including semiconductors, consumer electronics, transportation and aerospace and defense. NI’s research found that companies “with more advanced product data strategies are seeing better business outcomes.”

The report revealed a disconnect between the concept of a data strategy and actually having or implementing one. Fully 65% of respondents said that a data strategy was “essential” to optimizing a product’s lifecycle, yet 47% reported that their company’s current data strategy was either “limited” or “not advanced.” More than half—55%—also said that the cost of transforming their current product lifecycle was so high that they couldn’t justify the investment—even though 46% said that they would probably lose market share in the next few years if they didn’t make major changes to their lifecycle processes.

Even so, 70% of those with “limited” data strategies say they have invested in product data and analytics as a priority in the past 12 months. Their more advanced counterparts, meanwhile, “are more likely to prioritize cutting-edge technologies such as machine learning, digital twins, and robotic process automation (RPA),” the report found.

“We believe the holy grail [of data strategy] will always be an end point that you’re always going to be striving towards,” says Santori, going on to add that “What we see, and what we’ve learned from our customers as well as this study, is that the really advanced capabilities come when you’re working across silos in a company. … Connecting engineering to manufacturing to end use”, or even delving into the supply chain for additional insights to draw upon.

“The more advanced people are connecting across functional boundaries,” he explains, adding, “They enable connections across the entire life cycle, because design issues and test issues relate to manufacturing issues, relate to end use issues.”

Product-related data from testing is ripe for such connection, but the report found that test data was “the most underutilized resource.” One-third of the survey respondents said that an “inability to integrate or gain insights from test data” was preventing them from improving their product lifecycle. Santori says he knows why.

“Test is often thought of as an after-thought, or an extra step; so it’s a separate group that does the testing. Things are kind of thrown over the wall,” he explains. “Test systems tend to stay around for a long time, and test engineers will tell you, ‘no way do I want IT messing with my test system.’ Well, that means the way the data is stored is not accessible.” In industries like aerospace and defense, funding and programs (and thus the resulting data) are kept separate. There is also a lot of highly manual test data out there, living in spreadsheets where it can’t be easily accessed, and “it’s really crazy complicated data,” Santori adds. RF testing results in massive amounts of data in a short amount of time, and the design of a 5G radio or part also results in large amounts of complex data.

“It’s just hard, nasty, hidden stuff,” he concludes. In addition, companies tend to be focused on hitting a product home run rather than its component parts, he says. “They think of all the money that they have to spend to achieve this grand vision, and what we end up working with them is on basics. Do you understand the problems that would be most valuable to fix? If it’s time to market, why is it time to market? Is design taking too long, is test disconnected?”

The answers to those questions can provide a starting point for a data strategy that isn’t an all-or-nothing monolith. NI’s report suggests that companies start with getting agreement on identified areas for improvement, then work backward to identify the relevant data sources that could be pulled in, then develop a standardization strategy so that digitalization and improved use of data-based feedback can be scaled and applied to new business areas.

Read more from NI’s report here.

ABOUT AUTHOR

Kelly Hill
Kelly reports on network test and measurement, as well as the use of big data and analytics. She first covered the wireless industry for RCR Wireless News in 2005, focusing on carriers and mobile virtual network operators, then took a few years’ hiatus and returned to RCR Wireless News to write about heterogeneous networks and network infrastructure. Kelly is an Ohio native with a master’s degree in journalism from the University of California, Berkeley, where she focused on science writing and multimedia. She has written for the San Francisco Chronicle, The Oregonian and The Canton Repository. Follow her on Twitter: @khillrcr