Any federal IT pro can tell you that analyzing log files is something they’ve been doing for years. But as applications grow more complex, performance becomes more critical, and security issues multiply, log analytics is fast becoming an essential component of an agency’s monitoring and management infrastructure.
At DLT, July is Big Data Month, when we highlight all things big data in the public sector. To kick it off, we sat down with DLT Chief Data Scientist Sherry Bennett to get her insights into what is going on in the world of public sector data and analytics:
INTERVIEWER: So Sherry, what's your story and what led you to DLT?
Public sector leaders face an uphill battle when it comes to managing data. Their needs are growing while their budgets are often shrinking. How can federal agencies do more with less?
Leveraging enterprise open source solutions is part of the answer. They can help agencies close this gap and meet today’s needs—whether that’s analyzing warfighter data or building Smart Cities—while also planning for tomorrow’s challenges.
The U.S. government spends trillions of dollars each year on benefits programs like Social Security, Medicare, and Medicaid. Unfortunately, billions of those dollars are improperly paid, reducing the benefits available to those who rely on them most. In 2016, the White House estimated these losses at $144B.
Improper payments, fraud, and abuse take many forms. Consider these examples of Medicaid fraud and abuse:
With a constant influx of data, one of the biggest challenges facing government agencies is determining “What is the right data?” or “What data does my agency need for mission success?”
Then, once the right data is identified, how do you make it actionable? How do you integrate and visualize it for better insights?