Big Data Month: New eBook Sheds Light on How Government is Overcoming Persistent Big Data Challenges
There are many opportunities in the public sector for data science and data analytics, yet almost as many challenges. When we kicked off Big Data Month at DLT, we asked our Chief Data Scientist, Sherry Bennett, for her insights. What became clear is that the obstacles to big data success are universal to both the public and private sectors: “…everybody…is grappling with the same thing. They are trying to figure out how to take data that is in different systems and integrate it so that you have a full view of your customer and business operations. Everybody is still struggling with that. How to integrate and how to best utilize that integration.”
Why is data integration so hard? Let’s break it down.
Data is Evolving
Technology is moving forward at a rapid pace, bringing with it new data sources – IoT, sensor-enabled equipment, social network feeds, data warehouses, SaaS, and more – and new data types – structured, unstructured, raw, and processed. Unique applications and/or complex translations are often required simply to view the data, and correlating the data across these environments is both complicated and costly. In many instances, these systems have no way of communicating with one another. Data has also taken on another attribute – velocity. The speed at which data is received and acted upon keeps increasing.
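To make the integration problem concrete, here is a minimal sketch: two systems store overlapping records about the same applicants under different schemas, and each record must be normalized into a common schema before a unified view is possible. All system names, field names, and records here are hypothetical illustrations, not drawn from any real agency system.

```python
# Hypothetical example: merge records from two differently-structured systems
# (a "benefits" system and a "grants" system) into one view per applicant.

def normalize_benefits(record):
    """Map a benefits-system record onto a common schema."""
    return {
        "applicant_id": record["id"],
        "name": record["full_name"],
        "benefit_claims": record.get("claims", []),
    }

def normalize_grants(record):
    """Map a grants-system record onto the same common schema."""
    return {
        "applicant_id": record["grantee_id"],
        "name": record["grantee"],
        "grant_awards": record.get("awards", []),
    }

def integrate(benefits_records, grants_records):
    """Merge normalized records from both systems, keyed on applicant ID."""
    view = {}
    for rec in map(normalize_benefits, benefits_records):
        view.setdefault(rec["applicant_id"], {}).update(rec)
    for rec in map(normalize_grants, grants_records):
        view.setdefault(rec["applicant_id"], {}).update(rec)
    return view

benefits = [{"id": "A-1", "full_name": "Jane Doe", "claims": ["unemployment"]}]
grants = [{"grantee_id": "A-1", "grantee": "Jane Doe", "awards": ["research"]}]
full_view = integrate(benefits, grants)
```

Even this toy version shows why the work is hard at scale: every source needs its own mapping into the shared schema, and the mapping has to be maintained as each source system evolves.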
Legacy data storage systems and architectures are ill-equipped to deal with these characteristics of the modern data enterprise. For example, any agency that processes benefits and grants must deal with a subset of applicants who may try to fraudulently claim benefits to which they are not entitled. Most governments face this risk, and their aging technology infrastructures are vulnerable.
In many instances, claims audit systems still run on decades-old technologies. A typical batch job might take days to process. Furthermore, these systems may only have access to incomplete or stale data. This limits an agency’s ability to perform data discovery and makes it easier for perpetrators of fraud to avoid detection.
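A toy illustration of the stale-data problem described above: a duplicate-claim check is only as good as the data it can see. Run against the days-old snapshot a batch job processed, the duplicate slips through; run against the complete, current set of claims, it is caught. All claim records and field names here are hypothetical.

```python
# Hypothetical duplicate-claim screen: flag claim IDs whose
# (applicant, benefit) pair has already been seen.

def find_duplicate_claims(claims):
    """Return claim IDs that repeat an earlier (applicant, benefit) pair."""
    seen = {}
    duplicates = []
    for claim in claims:
        key = (claim["applicant_id"], claim["benefit"])
        if key in seen:
            duplicates.append(claim["claim_id"])
        else:
            seen[key] = claim["claim_id"]
    return duplicates

# What the batch job saw days ago:
stale_snapshot = [
    {"claim_id": "C-100", "applicant_id": "A-1", "benefit": "housing"},
]
# What actually exists now, including a second claim for the same benefit:
current_claims = stale_snapshot + [
    {"claim_id": "C-205", "applicant_id": "A-1", "benefit": "housing"},
]

stale_result = find_duplicate_claims(stale_snapshot)    # nothing flagged
current_result = find_duplicate_claims(current_claims)  # duplicate caught
```

The logic is trivial; the gap is entirely in the freshness and completeness of the data the audit system can reach.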
Cold Hard Cash
And then, of course, there’s money. Agencies must manage all this data on shrinking budgets while still giving users the capabilities they need to solve problems, glean insights, and make data actionable – which means a lower-cost data architecture is needed.
How can you overcome these challenges? That’s the topic of our latest eBook, written in collaboration with Hortonworks. A quick read, the eBook uses use cases and examples to explain how public sector agencies can build a modern, secure, open source data architecture that:
• Stores and analyzes data at massive scale
• Connects people and devices across boundaries and roles for better collaboration
• Extracts critical insights from all types of data from any source
• Reduces data storage costs with inexpensive commodity storage