The end of the federal fiscal year is a great time for public sector decision-makers to consider how their respective organizations can get a head start on supporting the Federal Data Strategy Action Plan. The current draft of the plan, which is scheduled to be finalized by September, contains several recommendations required to be completed within a 3-, 6- or 12-month timeframe.
DLT Solutions recently sat down with NetApp Senior Director of U.S. Public Sector Channel Sales, David Drahozal, to discuss the recent evolution of NetApp's data fabric solutions.
DLT: So David, tell us a little about what you do at NetApp.
David: For over a decade, I’ve had the privilege of leading the U.S. Public Sector channel here at NetApp. It’s been a really exciting time as we’ve evolved from a NAS company, to a leader in the consolidated, virtualized data center, to today a leader in hybrid cloud capabilities.
Capital One has announced that about 140,000 Social Security numbers and 80,000 linked bank accounts were compromised “in one of the biggest-ever data breaches,” affecting some 100 million individuals in the U.S. and 6 million in Canada.
The summer of 2019 is off to a great start for data professionals seeking to make valuable contributions in the federal public sector. After several solicitations for public comment over the last year, the Office of Management and Budget (OMB) has at last issued the final draft of the Federal Data Strategy. The strategy is designed to help the government accelerate its use of data to drive and deliver on mission objectives.
A survey of Department of Defense employees, commissioned by DLT partner Veritas and conducted by Federal Computer Week, found that 46% of respondents agree that data drives all or most of their decisions, yet only 13% would rate their data management capabilities as “Excellent,” while 60% rate them as “Satisfactory” or “Poor.”
Challenges Across Each Stage of Data Life Cycle
The purpose of this workshop is to provide senior-level executives with a framework and set of actionable steps to understand and assess the data maturity of their organization. By leveraging the Federal Government Data Maturity Model (FGDMM), attendees will engage in "hands-on" exercises to articulate a data strategy and roadmap for their agency.
At this workshop, you will hear from a number of industry leaders as they share their experiences with Data Strategy and Management, including:
Public sector organizations generate huge volumes of log data each day from servers, virtualization infrastructure, databases, security systems, applications, and more. And, according to IDC, unstructured data is growing at a compound annual rate of 60%. But because it lacks a predefined structure, that data, often called machine data, is much harder to analyze than structured data.
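To see why machine data resists straightforward analysis, consider a minimal sketch in Python. The log format and field names here are hypothetical, not from any specific product: unlike a database row, a free-form log line carries no schema, so its structure has to be recovered with a pattern before it can be queried or aggregated.

```python
import re

# A hypothetical web-server log line -- free-form "machine data"
LOG_LINE = '203.0.113.7 - - [29/Jul/2019:10:15:42 +0000] "GET /index.html HTTP/1.1" 200 5316'

# The structure is implicit; a regular expression makes it explicit
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<size>\d+)'
)

def parse_log_line(line):
    """Return the line's fields as a dict, or None if it doesn't match."""
    match = LOG_PATTERN.match(line)
    if match is None:
        return None
    record = match.groupdict()
    # Convert numeric fields so they can be aggregated, not just displayed
    record["status"] = int(record["status"])
    record["size"] = int(record["size"])
    return record
```

Every distinct log format needs its own pattern, and any line the pattern doesn't anticipate is silently lost, which is precisely the analysis burden that structured data avoids.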
Technological innovation in storage and computing capacity has advanced exponentially, especially in the healthcare space. One of the most dramatic use cases is the field of genomics. As Andrea Norris, the CIO for NIH, mentioned at a public sector Healthcare Summit on IT Modernization last week, technology has greatly increased the speed and lowered the cost of gene sequencing. The original Human Genome Project, which ran from 1990 to 2003, took around 13 years and roughly $2.1 billion in funding to complete the task.
Expert Panel: The Challenges and Opportunities for Modernizing Data Protection
As online data has become ubiquitous, managing that data has become as important an endeavor as amassing and storing it. A host of issues surround data management, not the least of which is security. But many others loom as data increases exponentially both in size and in importance.