[Webinar] Army Chief Discusses How Agencies Can Lay Foundations for Next Gen Data Center

Hybrid clouds are increasingly seen by government agencies as a comfortable way to satisfy both their desire and their mandates to move to the cloud. Highly sensitive data can be stored on-premises in a private cloud – using existing infrastructure – while lower-risk data and non-sensitive computing workloads can be placed in the public cloud or wherever they fit best. Hybrid clouds also let agencies move quickly when situations change.
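To make that placement decision concrete, here is a minimal sketch, assuming a simple three-level sensitivity label; the labels, tier names, and the Workload class are invented for illustration and are not any agency's actual policy.

```python
# Hypothetical placement sketch: route a workload to a hosting tier based on
# the sensitivity of its data. Labels and tiers are illustrative assumptions.
from dataclasses import dataclass

PLACEMENT = {
    "high": "on-premises private cloud",      # highly sensitive data stays in-house
    "moderate": "private or community cloud",
    "low": "public cloud",                    # non-sensitive workloads go where they fit best
}

@dataclass
class Workload:
    name: str
    sensitivity: str  # "high", "moderate", or "low"

def place(workload: Workload) -> str:
    # Default to the most restrictive tier if the label is missing or unknown.
    return PLACEMENT.get(workload.sensitivity, PLACEMENT["high"])

if __name__ == "__main__":
    for w in (Workload("hr-records", "high"), Workload("public-website", "low")):
        print(f"{w.name} -> {place(w)}")
```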

Managing Today’s Virtualization Challenges by Looking at the Past and Predicting the Future

Can you afford for your team to lose eight hours a day? According to the 2016 State of Data Center Architecture and Monitoring and Management report by ActualTech Media in partnership with my company, SolarWinds, that’s precisely what is happening to today’s IT administrators when they try to identify the root cause of a virtualization performance problem. And that doesn’t even take into account the time required to remediate it.

Virtualization: The Dark Side

The race to virtualize everything has created a host of unintended consequences, not the least of which is how to meet the SLAs (service-level agreements) for application backup. As we move to cloud alternatives, this problem will only grow, since your cloud provider will have to provide this to you on an application-by-application basis. Every virtual machine is essentially a set of large files, such as VMDKs in a VMware context. These large files are typically stored on storage arrays connected via iSCSI or Fibre Channel, or on NFS volumes. Traditional data protection techniques such as VMware's VADP or VMware VCB rely on an agent to protect the VMDK files associated with virtual servers.
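To illustrate the point that a VM is essentially a set of VMDK files, here is a minimal sketch that lists the VMDK files backing each virtual machine in a vCenter inventory. It assumes the pyVmomi library and reachable vCenter credentials – tooling the post itself does not specify – and disables certificate verification for brevity, so treat it as illustrative only.

```python
# Hypothetical sketch (assumes pyVmomi and vCenter access): list the VMDK files
# that back each virtual machine in the inventory. Host, user, and password are
# placeholders; certificate verification is disabled for demo purposes only.
import ssl

from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

def list_vmdks(host: str, user: str, pwd: str) -> None:
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE          # lab/demo use only
    si = SmartConnect(host=host, user=user, pwd=pwd, sslContext=ctx)
    try:
        content = si.RetrieveContent()
        view = content.viewManager.CreateContainerView(
            content.rootFolder, [vim.VirtualMachine], True)
        for vm in view.view:
            disks = [dev.backing.fileName
                     for dev in vm.config.hardware.device
                     if isinstance(dev, vim.vm.device.VirtualDisk)]
            print(f"{vm.name}: {disks}")     # e.g. ['[datastore1] vm01/vm01.vmdk']
    finally:
        Disconnect(si)                       # view is released with the session

if __name__ == "__main__":
    list_vmdks("vcenter.example.gov", "readonly@vsphere.local", "password")
```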

Business Impact Analysis: The Foundation of a Disaster Recovery Plan

Consider the following statistics taken from the Disaster Recovery Journal (Winter 2011):
  • A single incident of data loss can cost a company an average of $10,000.
  • 93 percent of companies that lost their data for 10 days or more filed for bankruptcy within a year.
  • 40 percent of businesses that suffer a loss of data fail within 5 years.
And while most companies and organizations have taken disaster recovery seriously, they often fail to conduct a proper BIA, or Business Impact Analysis, and to properly test their plan for appropriateness – often resulting in losses. A BIA is exactly what it sounds like: research to determine what the business impact would be if an application, website, database, HR document, etc. were unavailable for given periods of time. Perhaps if a database were unavailable for an hour there would be little impact, but if it were down for a day, the impact would be critical. It is important to do an accurate study to determine where those pain points are for all aspects of your organization and to review them regularly for changes in criticality. While this sounds like the absolute foundation of any DR plan (and it is), I have regularly encountered organizations in both government and private industry that fail to take this most basic step. They either consider everything to be critical (it isn't) or they back up only a few servers that they think contain their most important documents and data. Neither approach accomplishes suitable DR.
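As a rough illustration of that exercise, the sketch below rates impact as a function of outage duration per application; the thresholds and application names are invented for the example and would come from an organization's own BIA interviews, not from any standard.

```python
# Hypothetical BIA worksheet sketch: impact grows with outage duration.
# Thresholds (in hours) and applications are illustrative assumptions only.
IMPACT_THRESHOLDS = {
    # application: [(max_hours_down, impact_rating), ...] in ascending order
    "customer-database": [(1, "low"), (8, "high"), (24, "critical")],
    "internal-wiki":     [(24, "low"), (72, "moderate"), (168, "high")],
}

def impact(application: str, hours_down: float) -> str:
    """Return the impact rating for an outage of the given length."""
    for max_hours, rating in IMPACT_THRESHOLDS[application]:
        if hours_down <= max_hours:
            return rating
    return "critical"  # anything beyond the last threshold

if __name__ == "__main__":
    print(impact("customer-database", 1))   # low: an hour is tolerable
    print(impact("customer-database", 24))  # critical: a full day is not
```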

To Share or Not to Share, That is the Question

When Microsoft introduced the Windows 2000 operating system, it also introduced a new way of managing disks and the capacity they contain. Developed in conjunction with VERITAS Software, Disk Management in Windows 2000 uses the concept of dynamic disks to let users reconfigure their storage dynamically, i.e. without reboots. The default disk type, referred to as basic, allows for the creation of partitions and logical drives, much as Disk Administrator did in Windows NT.

NetApp and Quantum Synergies

With many of us experiencing explosive growth, companies in the broadcast, media, and entertainment space are left trying to figure out how to improve data availability while minimizing cost. Here at DLT we are helping customers:
  • Eliminate wasteful copies
  • Create a centralized repository
  • Simplify management
  • Move data to the right storage tier automatically
How do we do this? We at DLT have partnered with the industry's best-of-breed vendors and combined their technologies to create a powerful solution.
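One piece of that picture, moving data to the right storage tier automatically, can be sketched in a few lines; the 90-day age cutoff and the tier paths below are assumptions for illustration, not the actual NetApp/Quantum policy engine.

```python
# Hypothetical tiering sketch: relocate files untouched for N days to an archive
# tier. The cutoff and the primary/archive paths are illustrative assumptions.
import shutil
import time
from pathlib import Path

ARCHIVE_AFTER_DAYS = 90

def tier_cold_files(primary: Path, archive: Path) -> None:
    """Move files not accessed within the cutoff window to archive storage."""
    cutoff = time.time() - ARCHIVE_AFTER_DAYS * 86400
    for path in primary.rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            dest = archive / path.relative_to(primary)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), str(dest))

# Example: tier_cold_files(Path("/mnt/primary"), Path("/mnt/archive"))
```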

There's A Storm Rising – Data Center Consolidation

The Federal Government has recently embarked on an initiative to consolidate its 1,100 data centers - the advertised primary objective being cost reduction, chiefly through energy savings. While a simplistic approach to consolidation (virtualize everything and use lots of clouds) may be tempting, changes taking place in the IT industry argue for careful consideration of the technology suites now coming to market. The storm that I see rising is the war between Oracle-Sun, H