Data Centers

June 22, 2017
The 5 Features of an Always-On Data Platform
From streaming movies to checking email 24×7, we take “always-on” data applications for granted. But in the public sector, this always-on mindset is a little slower to catch on. From cyber-attacks to power outages, data center outages to application failure, IT outages are an ongoing problem for the U.S. government. A 2015 report by MeriTalk, […]
May 16, 2017
[Webinar] Army Chief Discusses How Agencies Can Lay Foundations for Next Gen Data Center
Hybrid clouds are increasingly seen by government agencies as a comfortable answer to both their desire to move to the cloud and the mandates requiring it. Highly sensitive data can be stored on-premises in a private cloud – using their existing infrastructure – while lower-risk data and non-sensitive computing workloads can be placed in the public cloud […]
January 19, 2017
Managing Today’s Virtualization Challenges by Looking at the Past and Predicting the Future
Can you afford for your team to lose eight hours a day? According to the 2016 State of Data Center Architecture and Monitoring and Management report by ActualTech Media in partnership with my company, SolarWinds, that’s precisely what is happening to today’s IT administrators when they try to identify the root cause of a virtualization […]
March 11, 2013
Technically News – 3/11
In this edition: Symantec CTO: Enterprise Security Still Needs Humans; As Data Centers Consolidate, Those Remaining Need to be More Efficient; Complexity is Cybersecurity’s Real Enemy; Cybersecurity Challenges in 2013; GitHub Hires First Government Liaison.
December 3, 2012
Technically News – 12/3
Technically News: Government Closes More Data Centers, NetApp Links to Amazon’s Cloud, Cybersecurity Needs of the Borderless Enterprise, Agency Saves $500 Million Through Technology, and More
November 7, 2011
Virtualization, the dark side
The race to virtualize everything has created a host of unintended consequences, not the least of which is how to meet the SLAs (service level agreements) for application backup. As we move into cloud alternatives, this problem will only grow, since your cloud provider will have to provide this to you on an application-by-application basis. Every virtual machine is essentially a set of large files, such as VMDKs in a VMware context. These large files are typically stored in storage arrays connected via iSCSI or Fibre Channel, or on NFS volumes. Traditional data protection techniques such as VMware's VADP or VMware VCB rely on an agent to protect the VMDK files associated with virtual servers.
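To make the backup SLA problem concrete, here is a minimal back-of-the-envelope sketch (not from the original post): given a set of VMs, each a collection of large VMDK files, it estimates whether a full copy fits inside a nightly backup window. The VM names, disk sizes, and throughput figure are all hypothetical.

```python
# Rough estimate of full-backup time for a set of virtual machines, each
# represented by its VMDK file sizes. All names, sizes, and the throughput
# figure below are hypothetical.

vmdk_sizes_gb = {
    "app-server-01":  [120, 40],    # OS disk + data disk
    "db-server-01":   [80, 500],
    "file-server-01": [60, 900],
}

EFFECTIVE_THROUGHPUT_MB_S = 150   # sustained MB/s readable from the array
BACKUP_WINDOW_HOURS = 8           # nightly window promised in the SLA

total_gb = sum(sum(disks) for disks in vmdk_sizes_gb.values())
hours_needed = total_gb * 1024 / EFFECTIVE_THROUGHPUT_MB_S / 3600

print(f"Total VMDK data: {total_gb} GB")
print(f"Estimated full-backup time: {hours_needed:.1f} h (window: {BACKUP_WINDOW_HOURS} h)")
if hours_needed > BACKUP_WINDOW_HOURS:
    print("Full copies will not fit; incremental or changed-block approaches are needed.")
```

With 1,700 GB at 150 MB/s the full copy takes roughly 3.2 hours, but the same arithmetic turns unfavorable quickly as VM counts grow, which is exactly the SLA pressure the post describes.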
September 19, 2011
Business Impact Analysis: The Foundation of a Disaster Recovery Plan
Consider the following statistics, taken from the Disaster Recovery Journal (Winter 2011):
• A single incident of data loss costs a company an average of $10,000.
• 93 percent of companies that lost their data for 10 days or more filed for bankruptcy within a year.
• 40 percent of businesses that suffer a loss of data fail within 5 years.
And while most companies and organizations have taken disaster recovery seriously, they often fail to conduct a proper BIA, or Business Impact Analysis, and to properly test their plan for appropriateness, often resulting in losses. A BIA is exactly what it sounds like: research to determine what the business impact would be if an application, website, database, HR document, etc. were unavailable for a given length of time. Perhaps if a database were unavailable for an hour there would be little impact, but if it were down for a day, the impact would be critical. It is important to do an accurate study to determine where those pain points are for all aspects of your organization and to review them regularly for changes in criticality. While this sounds like the absolute foundation of any DR plan (and it is), I have regularly encountered both government agencies and private companies that fail to take this most basic step. They either consider everything to be critical (it isn’t) or back up only a few servers that they think contain their most important documents and data. Neither approach accomplishes suitable DR.
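As a toy illustration of the kind of output a BIA produces (not drawn from the article itself), the sketch below maps a few hypothetical assets to impact tiers at increasing outage durations and derives a rough recovery target for each. Every asset name, duration, and tier here is invented for illustration.

```python
# Toy Business Impact Analysis (BIA) table. For each asset, the estimated
# impact tier at increasing outage durations. All assets, durations, and
# tiers are hypothetical illustrations, not real assessments.

OUTAGE_WINDOWS_HOURS = [1, 8, 24, 72]

# Impact tier per outage window: "low", "moderate", or "critical".
bia = {
    "public-website":  ["moderate", "critical", "critical", "critical"],
    "hr-documents":    ["low", "low", "moderate", "critical"],
    "orders-database": ["low", "critical", "critical", "critical"],
    "internal-wiki":   ["low", "low", "low", "moderate"],
}

def first_critical_window(impacts):
    """Return the outage duration (hours) at which impact becomes critical."""
    for window, impact in zip(OUTAGE_WINDOWS_HOURS, impacts):
        if impact == "critical":
            return window
    return None  # never critical within the windows studied

for asset, impacts in sorted(bia.items()):
    window = first_critical_window(impacts)
    target = f"recover within {window} h" if window else "no hard deadline in range"
    print(f"{asset:16s} {target}")
```

Reviewing the resulting targets regularly, as the post recommends, catches assets whose criticality has drifted since the last assessment.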
To Share or Not to Share, That is the Question
When Microsoft introduced the Windows 2000 operating system, it introduced a new way of managing disks and the capacity they contain. Developed in conjunction with VERITAS Software, Disk Management in Windows 2000 uses the concept of dynamic disks to let users reconfigure their storage dynamically, i.e., without reboots. The default disk type, referred to […]
NetApp and Quantum Synergies
With so many organizations experiencing explosive data growth, companies in the broadcast, media, and entertainment space are left trying to figure out how to improve data availability while minimizing cost. Here at DLT we are helping customers:
There's A Storm Rising – Data Center Consolidation
The Federal Government has recently embarked on an initiative to consolidate its 1,100 data centers – the advertised primary objective being cost reduction, chiefly through energy savings. While a simplistic approach to consolidation (virtualize everything and use lots of clouds) may be tempting, changes taking place in the IT industry argue for careful consideration of […]