Q&A: Kirk Kern, CTO, Cloud Technologies of NetApp, Discusses Next-Gen Data Mining for Government
The rise of anything as a sensor and the “Internet of Things” is creating a data deluge for federal agencies. While this information is valuable for enhancing decision-making that supports mission goals and requirements, making this data truly actionable can be challenging for any organization.
Fortunately, there are data mining solutions and strategies that allow government agencies to manage this influx of data, while also using it for real-time decision-making.
TechSource recently had the opportunity to sit down with Kirk Kern, US Public Sector CTO, Cloud Technologies of NetApp, who shared his key insights into the future of data mining and government. Below is the full interview.
TechSource: Let’s start at a high-level. Tell us about why the right data mining solutions are so important to government agencies.
Kern: If you look back over the past five years, we have seen a data evolution, which has spanned everything from grid computing to big data to now the cloud for storing and processing information.
Along with this, there are three information management principles of data mining: data creation, protection and intelligence generation. From capturing the information to protecting it to developing actionable insights from it, each principle has its own consumption model and associated software service. This is why federal agencies need the right solutions to meet the needs of these three principles.
TechSource: Will the “massive data proliferation” trend in government continue into the future?
Kern: Clearly, the federal government continues to deploy more and more assets that play a major role in the proliferation of data. In addition, if you look at the imminent rise of the “Internet of Things,” virtually everything will become a sensor, with every device having a point of presence on the Internet.
With all of these changes, the government will continue to need to mine this data in ways that turn the information into actionable intelligence that can improve our lives or make our government programs more useful. It’s hard to predict what kind of intelligence can be pulled from these data sources, but it is clear that the data deluge will continue and that this will be an ongoing challenge.
TechSource: How can government agencies best leverage next-generation data mining solutions?
Kern: Conceptually, we envision the cloud as being a huge contributor to the storing, accessing and managing of all of this data. What we are attempting to do is federate the data and stitch it all together into a “data fabric,” where it is woven together into one core asset.
We are approaching this from the perspective that this information has to be shared with the proper security mechanisms in place. The data also has to be searchable from any endpoint. In addition, as we migrate away from the larger on-premises storage engines, we will see more services connected with the cloud, where the information can be accessed from a mobile device as well as the traditional compute environment. Essentially, we are building the right data packages to enable all of this to be a reality.
TechSource: How can cloud-based solutions like the hybrid cloud help with this challenge? Tell us about NetApp’s solutions in this arena.
Kern: We are taking NetApp capabilities and virtualizing them to build a platform that can operate as software-defined storage or a storage-as-a-service framework. As we move towards this disembodied services framework, our virtualized storage systems can operate anywhere, and the software services can be defined by and operate in the cloud.
This allows federal agencies to balance their public and private cloud resources to match their mission needs with our hybrid cloud solution. Agencies can leverage their on-premises private clouds – with storage and disks and racks – while also embracing the more shared environment of the public cloud.
TechSource: Anything to add?
Kern: Beyond this “data fabric” that we are weaving, we are seeing a migration to a different set of software standards for managing the data. We tend to see many protocols being leveraged. As we move up the stack to virtualization and cloud-based sharing, there will be a need for a standards-based, long-term data repository. NetApp is currently embracing open data initiatives for a unified approach to developing this interoperability framework.
We would like to thank Kirk for taking the time to speak with TechSource. To learn more about NetApp’s Data Management Solutions for Government Organizations, click here.