Law enforcement and intelligence agencies deal with large volumes of disparate data on a daily basis. Analyzing all of that data across multiple data silos and structures, each with its own security permissions, is a major challenge.
The Federal Data Strategy principles (https://strategy.data.gov/principles), as currently articulated, are a set of best practices and guidelines that could be used to govern how an organization develops and matures its management of data as an asset. However, without guidance or a framework for putting them into practice, these principles risk becoming little more than a wish list.
Any federal IT pro can tell you that analyzing log files is something they’ve been doing for years. But as applications grow more complex, performance becomes more critical and security issues multiply. Log analytics is fast becoming a critical component of an agency’s monitoring and management infrastructure.
So much data, so little time. Disparate sources such as sensors, machines, geo-location devices, social feeds, server and security system logs, and more, are generating terabytes of data at unfathomable speeds. Getting any kind of real-time insight and, we dare you to dream, acting on that data as it flows in, is not an easy feat for resource-constrained government agencies.
Data is everywhere in government, but turning that data into actionable information and insights remains a persistent problem. “We are data rich and information poor,” said Shelley Metzenbaum, a former associate director for performance and personnel management at the Office of Management and Budget (OMB), at a recent IBM Center for The Business of Government session on “Envision Government in 2040.”
At DLT, July is Big Data Month, when we will be highlighting all things big data in the public sector. To kick off Big Data Month, we sat down with DLT Chief Data Scientist Sherry Bennett to get her insights into what is going on in the world of public sector data and analytics:
INTERVIEWER: So Sherry, what's your story and what led you to DLT?
Agencies are dealing with exponential growth in data. But size isn’t the only problem; so is where that data lives and how it moves between private clouds, public clouds, and on-premises environments.
How do you protect, secure, and back up that data? How can your agency protect the right data and invest only in what is important to the mission, without creating a new set of data silos, incurring hidden storage costs, stalling developers, or introducing greater compliance risk?
The promise of the latest Facilities Management (FM) software is an exciting one: reduced asset and space management costs, increased productivity, and a better-managed asset lifecycle. One downside of implementing new technology, however, is that while it increases productivity in one area, it can expose or exacerbate bottlenecks or gaps in data or processes in another. Facilities management implementations have an Achilles’ heel, and that heel is data.
As federal, state, and local agencies amass more and more structured and unstructured data, data analytics can help them save time and resources by grounding decisions in that data. However, challenges remain: agencies struggle to find the right skill sets and to manage the complexity of implementing multi-structured data platforms.
In some ways, data has become just as much a colleague to federal IT managers as the person sitting next to them. Sure, data can’t pick up a burrito for you at lunchtime, but it’s still extraordinarily important to agency operations. Data keeps things going so that everyone in the agency can do their jobs – just like your fellow IT professionals.