The United States spent $554 billion on discretionary contracts for goods and services during FY18, covering everything from janitorial services to hand grenades. That figure exceeds the gross domestic product of Sweden, along with that of 165 other countries.
Since the release of the final Federal Data Strategy Action Plan (henceforth FDSAP) at the end of last year, much has been written about the Federal Data Strategy. However, little has been written about the important role it can play in enabling the adoption of AI within the federal public sector.
Is data the new bacon? The world’s most valuable resource? The fuel that powers the digital enterprise?
The most successful companies in the world – Google, Facebook, Amazon, Netflix, etc. – use data to drive business strategies. Being insight-driven isn’t just the domain of consumer tech. Government agencies rely on data to make quick, informed decisions, enhance productivity, improve transparency and build trust with citizens, eliminate fraud and abuse, reduce crime and security threats, and more.
Data continues to transform the public sector in meaningful and compelling ways. As vast creators and consumers of data, government agencies have a unique opportunity to revolutionize their decision-making power by using data to the fullest.
The summer of 2019 is off to a great start for data professionals seeking to make valuable contributions in the federal public sector. After several solicitations for public comments over the last year, the Office of Management and Budget (OMB) has at last issued the final draft of the Federal Data Strategy. The Federal Data Strategy is designed to help the government accelerate the use of data to drive and deliver on mission objectives.
A survey of Department of Defense employees commissioned by DLT partner Veritas and conducted by Federal Computer Week found that 46% of respondents agree that data drives all or most of their decisions, yet only 13% would rate their data management capabilities as “Excellent,” while 60% rate them as merely “Satisfactory” or “Poor.”
Challenges Across Each Stage of the Data Life Cycle
The purpose of this workshop is to provide senior-level executives with a framework and set of actionable steps to understand and assess the data maturity of their organization. By leveraging the Federal Government Data Maturity Model (FGDMM), attendees will engage in "hands-on" exercises to articulate a data strategy and roadmap for their agency.
At this workshop, you will hear from a number of industry leaders as they share their experiences with Data Strategy and Management, including:
Public sector organizations generate huge volumes of log data each day from servers, virtualization infrastructure, databases, security systems, applications, and more. And, according to IDC, unstructured data is growing at a compound annual rate of 60%. But because it lacks structure, that data, often called machine data, is much harder to analyze than structured data.
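To see why machine data is harder to analyze, consider a minimal sketch in Python. The raw log line and the field names below are hypothetical examples, not taken from any specific product: before a free-form syslog-style line can be queried or aggregated, it first has to be parsed into structured fields, typically with a pattern like the regex shown here.

```python
import re

# Hypothetical raw "machine data": a free-form, syslog-style log line.
raw = "2019-06-12T08:15:02Z host01 sshd[4242]: Failed password for admin from 10.0.0.5"

# Structured data arrives with named fields already in place; machine data
# must be parsed first. A regex with named groups does that structuring.
pattern = re.compile(
    r"(?P<timestamp>\S+)\s+(?P<host>\S+)\s+(?P<process>\w+)\[(?P<pid>\d+)\]:\s+(?P<message>.*)"
)

match = pattern.match(raw)
record = match.groupdict()  # now a dict of named fields, ready for analysis

print(record["host"])     # host01
print(record["process"])  # sshd
```

In practice every application, device, and vendor emits a different free-form format, so a new pattern is needed for each source. That per-source parsing effort is exactly the overhead that structured data avoids.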
Technological innovation in storage and computing capacity has advanced exponentially – especially in the healthcare space. One of the most dramatic use cases is in the field of genomics. As Andrea Norris, CIO of the NIH, noted at a public sector Healthcare Summit on IT Modernization last week, technology has greatly increased the speed and lowered the costs associated with gene sequencing. The original Human Genome Project took around 13 years (1990–2003) and roughly $2.1 billion in funding to complete the task.