DLT Solutions recently sat down with NetApp Senior Director of U.S. Public Sector Channel Sales, David Drahozal, to discuss the recent evolution of NetApp's data fabric solutions.
DLT: So David, tell us a little about what you do at NetApp.
David: For over a decade, I’ve had the privilege of leading the U.S. Public Sector channel here at NetApp. It’s been a really exciting time as we have evolved from a NAS company, to a leader in the consolidated virtualized data center, to today being a leader in hybrid cloud capabilities.
Federal government agencies as well as states and cities are rich repositories of data – data on everything from health to public safety, education to the environment. But those same organizations have moved beyond being isolated data storehouses. Data is no longer locked away on devices and storage drives, hidden behind firewalls. Instead, it’s becoming distributed (cloud and on-premises), dynamic (the velocity of data from sensors, citizen attributes, etc. is constantly fluctuating) and diverse (structured, unstructured, and streaming).
Oh DevOps, DevOps.
You hear time and again how it’s the future of application development and deployment. You’re told you need to implement it and engrain its best practices across your organization.
But making the shift away from the old way of doing things is no easy task, however error-prone, slow, or disruptive that old way may be compared with the agility and utility that DevOps promises.
Hybrid clouds are increasingly seen by government agencies as a comfortable solution to both their desire and mandates to move to the cloud. Highly sensitive data can be stored on-premises in a private cloud – using their existing infrastructure – while lower risk data and non-sensitive computing workloads can be placed in the public cloud or where they fit best. Hybrid clouds also let agencies quickly move when situations change.
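The placement logic described above can be sketched in code. This is a hypothetical illustration, not a NetApp API: the workload names, sensitivity levels, and tier labels are all illustrative assumptions.

```python
# Hypothetical sketch of a hybrid-cloud placement policy: highly sensitive
# data stays on-premises in a private cloud, while lower-risk workloads
# are placed in the public cloud. All names here are illustrative.

def place_workload(sensitivity: str) -> str:
    """Return a placement tier for a workload based on its data sensitivity."""
    if sensitivity == "high":
        return "on-premises private cloud"
    elif sensitivity == "moderate":
        return "hybrid (on-premises storage, public cloud compute)"
    return "public cloud"

# Example agency workloads (invented for illustration).
workloads = {
    "citizen-health-records": "high",
    "gis-tile-cache": "low",
    "batch-analytics": "moderate",
}

for name, level in workloads.items():
    print(f"{name}: {place_workload(level)}")
```

Because the policy is just a function of workload attributes, an agency can re-run it when situations change and move workloads between tiers accordingly.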
“Big data” and the rise in demand for database services put increasing pressure on IT departments. End users demand access to data at higher speeds, and management wants that data delivered and managed at a lower operational cost.
Data stores are expanding at an exponential rate, compromising the performance of servers and applications. This growth also adds to the daily challenges and pains that administrators face – maintaining database health while meeting ever-evolving mission requirements.
With more and more software and services being offloaded to the cloud and the surge in big government data, the speed of your storage still matters – very much.