Federal government agencies as well as states and cities are rich repositories of data – data on everything from health to public safety, education to the environment. But those same organizations have moved beyond being isolated data storehouses. Data is no longer locked away on devices and storage drives, hidden behind firewalls. Instead, it’s becoming distributed (cloud and on-premises), dynamic (the velocity of data from sensors, citizen attributes, etc. is constantly fluctuating) and diverse (structured, unstructured, and streaming).
Another day, another government ransomware victim. On March 22, 2018, the city of Atlanta found itself locked out of computers across government offices and facing a ransom demand of $51,000, or $6,800 per computer, GCN reported.
No sooner do you have your arms around one cybersecurity vulnerability than another surfaces. This time it’s Meltdown and Spectre, both of which can allow data to leak from kernel memory. These vulnerabilities are particularly worrying because they affect practically all computers and involve multiple IT vendors, including processor makers Intel, AMD, Qualcomm, and ARM.
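For administrators wondering whether their Linux hosts are mitigated, modern kernels (4.15 and later) publish per-vulnerability status under sysfs. A minimal sketch of reading that interface (the `vulnerability_report` helper is illustrative, not part of any library):

```python
import glob
import os

# Linux kernels since 4.15 report Meltdown/Spectre mitigation status
# as one small text file per vulnerability under this sysfs directory.
VULN_DIR = "/sys/devices/system/cpu/vulnerabilities"

def vulnerability_report(path: str = VULN_DIR) -> dict:
    """Map each reported vulnerability name to its mitigation status.

    Returns an empty dict on systems (or kernels) that do not
    expose the sysfs vulnerabilities directory.
    """
    report = {}
    for entry in sorted(glob.glob(os.path.join(path, "*"))):
        with open(entry) as f:
            report[os.path.basename(entry)] = f.read().strip()
    return report

if __name__ == "__main__":
    for name, status in vulnerability_report().items():
        print(f"{name}: {status}")
```

On an affected, patched host this typically prints lines such as `meltdown: Mitigation: PTI`; on hardware or kernels without the interface, the report is simply empty.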
Agencies are dealing with exponential growth of data. But size isn’t the only problem; it’s also where that data lives and how it travels between private clouds, public clouds, and on-premises systems.
How do you protect, secure, and back up that data? How can your agency protect the right data and invest only in what is important to the mission, without creating a new set of data silos, incurring hidden storage costs, stalling developers, or introducing greater compliance risk?
From streaming movies to checking email 24x7, we take “always-on” data applications for granted. But in the public sector, the always-on mindset has been slower to catch on.
From cyber-attacks to power outages, data center outages to application failure, IT outages are an ongoing problem for the U.S. government.
As the federal government continues to navigate the transition to the new administration, the onus is on new political appointees to understand and adhere to records management rules and regulations while safeguarding classified information and personally identifiable information (PII). All this, of course, against a backdrop of exponential records growth, budgetary challenges, and other mission pressures.
Hybrid clouds are increasingly seen by government agencies as a comfortable answer to both their desire and their mandates to move to the cloud. Highly sensitive data can be stored on-premises in a private cloud, using existing infrastructure, while lower-risk data and non-sensitive computing workloads are placed in the public cloud or wherever they fit best. Hybrid clouds also let agencies move quickly when situations change.
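The placement rule described above can be sketched as a simple routing function. This is purely illustrative: the classification labels and the `place_workload` helper are assumptions for the sake of the example, not any agency’s actual policy.

```python
# Illustrative hybrid-cloud placement rule: workloads tagged with a
# sensitive data classification stay on-premises in the private cloud;
# everything else is eligible for the public cloud.

# Hypothetical classification labels assumed for this sketch.
SENSITIVE_CLASSIFICATIONS = {"classified", "pii", "phi"}

def place_workload(classification: str) -> str:
    """Return the cloud tier a workload should land in, by data label."""
    if classification.lower() in SENSITIVE_CLASSIFICATIONS:
        return "private-cloud"
    return "public-cloud"

if __name__ == "__main__":
    for label in ("PII", "open-data"):
        print(label, "->", place_workload(label))
```

In practice this decision would be driven by an agency’s data classification policy and enforced by governance tooling, but the core idea is the same: route by sensitivity, not by convenience.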
As costs have declined, all-flash storage has become the de facto enterprise standard for primary storage and the foundation for any cloud – internal, SaaS, or public. For the public sector, flash can address agency modernization efforts (mobile, data analytics and cloud) and help cut costs and reduce power and space requirements, despite growing data sets.
Consider the benefits:
As federal, state, and local agencies amass more and more structured and unstructured data, data analytics can help save time and resources by enabling informed decisions based on that data. Challenges remain, however, as agencies struggle to find the right skill sets and manage the complexity of implementing multi-structured data platforms.