Are you a master of VSS?
With NetBackup 7.0, Symantec no longer uses VSP (VERITAS Snapshot Provider). That leaves you with Microsoft's Volume Shadow Copy Service (VSS). Solutions to some of the common VSS problems are well documented in technical articles and blogs. But maybe, just maybe, you are the type of engineer who really wants to master VSS and understand more about how to troubleshoot it. Troubleshooting your VSS issues is as easy as setting up tracing.
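Before digging into trace logs, the usual first step is checking writer health with `vssadmin list writers` from an elevated prompt on the client. Below is a minimal Python sketch that parses that command's output to flag writers stuck in a non-stable state; the field labels and the sample output are illustrative of the command's typical format, so verify them against your own system:

```python
def failing_writers(vssadmin_output):
    """Return (writer, state, error) tuples for writers not in a Stable state.

    Expects text in the shape produced by `vssadmin list writers`
    (label names assumed; confirm against your Windows version).
    """
    results = []
    name = state = None
    for line in vssadmin_output.splitlines():
        line = line.strip()
        if line.startswith("Writer name:"):
            name = line.split(":", 1)[1].strip().strip("'")
        elif line.startswith("State:"):
            state = line.split(":", 1)[1].strip()
        elif line.startswith("Last error:"):
            error = line.split(":", 1)[1].strip()
            # A healthy writer reports "[1] Stable" with "No error".
            if name and state and "Stable" not in state:
                results.append((name, state, error))
            name = state = None
    return results

# Illustrative sample of the command's output format.
sample = """
Writer name: 'System Writer'
   State: [1] Stable
   Last error: No error
Writer name: 'SqlServerWriter'
   State: [8] Failed
   Last error: Non-retryable error
"""

print(failing_writers(sample))
```

A failed writer found this way usually narrows the problem to one application's VSS integration before you ever have to read a trace file.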
Best Practices for Achieving Migration to a Cloud Model
The following is a transcription of the Best Practices for Achieving Migration to a Cloud Model webcast that was held on February 23, 2011 and hosted by i360Gov.com. You may also view the archived version by registering online.
Moderator:
Good afternoon, and welcome to today's webinar, Best Practices for Achieving Migration to a Cloud Model, brought to you by i360Gov.com, DLT Solutions, Red Hat and NetApp. We have a great line-up of speakers today. Our first speaker is Van Ristau, he's Chief Technology Officer with DLT Solutions. Our second speaker today is Dawn Leaf, NIST Senior Executive for Cloud Computing, Senior Advisor, Information Lab at NIST. Our third speaker is Greg Potter, Research Analyst at In-Stat.
Before we get started, I just want to go over a few housekeeping items. Anytime during the next hour, if you'd like to submit a question, just look for the "Ask a Question" console, and we'll field your questions at the end of the presentation. If you have any technical difficulties during the webinar, click on the "Help" button located below the slide window, and you'll receive technical assistance. And finally, after the session is complete, we'll be emailing you a link to the archived version of this webinar, so you can view it again or share it with a colleague.
And now, I'd like to hand it over to Van Ristau – Van.
NetBackup 7 Deduplication Should Be Everywhere
NetBackup offers a variety of ways to reduce storage capacity using deduplication. In fact, we believe our users should deduplicate everywhere. Backup is the killer app for deduplication. Why? A backup is essentially a copy of your information, kept so you can recover in the event of corruption or something else going wrong. It’s basically an insurance policy. So if you’re doing full backups over the weekend and incrementals during the week, and your data change rate isn’t that high, why back up the same thing over and over again? Just think of how many instances of a particular file or application you have across your entire company. Say I send out a 3MB PowerPoint presentation for review, a co-worker changes the title slide, and another tweaks a bullet or two. Now there are three copies totaling 9MB with only a few minor changes!
Because of examples like that, it’s probably no surprise that data is growing at an alarming rate, and network, server, and storage systems are struggling to keep up. Many organizations still rely heavily on tape even while implementing next-generation technologies like virtualization. That makes the outdated practice of keeping all data forever untenable, especially while trying to meet today’s more aggressive service levels. Symantec believes deduplication should be natively integrated into any backup application, and with NetBackup 7 it is. Deduplication should live in every part of the information architecture, at both the source and the destination.
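The PowerPoint example above is exactly what block-level deduplication exploits: store each unique chunk of data once and keep a per-file "recipe" of chunk references. The toy sketch below uses fixed-size chunks for simplicity; production engines such as NetBackup's use variable-length chunking and far larger chunk sizes, so this only illustrates the bookkeeping:

```python
import hashlib

def dedupe(blobs, chunk_size=4):
    """Chunk each blob and store each unique chunk exactly once.

    Returns (store, recipes): `store` maps a SHA-256 digest to chunk
    bytes; each recipe lists the digests needed to rebuild one blob.
    """
    store, recipes = {}, []
    for blob in blobs:
        recipe = []
        for i in range(0, len(blob), chunk_size):
            chunk = blob[i:i + chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            store.setdefault(digest, chunk)  # identical chunks stored once
            recipe.append(digest)
        recipes.append(recipe)
    return store, recipes

def restore(store, recipe):
    """Rebuild a blob from its chunk recipe."""
    return b"".join(store[d] for d in recipe)

# Three near-identical "presentations": only the first chunk differs.
originals = [b"AAAAcommon-body", b"BBBBcommon-body", b"CCCCcommon-body"]
store, recipes = dedupe(originals)

total_raw = sum(len(b) for b in originals)          # bytes if stored as-is
total_stored = sum(len(c) for c in store.values())  # unique chunks only
print(total_raw, total_stored)  # 45 vs 23 bytes
```

The shared "body" chunks are stored once no matter how many copies circulate, which is why backup workloads, full of repeated weekly data, see such dramatic reductions.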
Oracle Open World Recap: Part IV.5.2
Exadata
After expounding on the benefits of Oracle’s high-performance cloud computing server, Exalogic, Larry went on to tout version 3 of Oracle’s renowned database machine, Exadata. With the release of version 3, Oracle now offers customers two versions of its acclaimed database machine: the X2-2 and the X2-8.
“The new configuration extends the Oracle Exadata Database Machine product family with a high-capacity system for large OLTP, data warehousing and consolidated workloads. There are now four configurations of the Oracle Exadata Database Machine: the new Oracle Exadata X2-8 full-rack and the Oracle Exadata X2-2 quarter-rack, half-rack and full-rack systems. Offering customers a choice of configurations for managing small to large database deployments, the Oracle Exadata X2-2 and Oracle Exadata X2-8 full-rack machines can scale to multi-rack configurations for the most demanding database applications.”
Larry emphatically proclaimed that Exadata has become the best machine for both data warehousing and OLTP, using SoftBank as an example. At SoftBank, he said, Oracle replaced a 60-rack Teradata configuration with only three full racks of Exadata. Depending on the application, those three Exadata racks ran 2x to 8x faster than the 60-rack Teradata configuration while using only 5% of the hardware. Oracle eliminated 95% of the racks and on average still ran five times faster.
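The consolidation arithmetic behind those keynote figures is easy to sanity-check from the rack counts alone:

```python
# Rack counts as cited in the keynote.
teradata_racks = 60
exadata_racks = 3

hardware_fraction = exadata_racks / teradata_racks  # 3/60 = 5% of the racks
racks_eliminated = 1 - hardware_fraction            # i.e., 95% removed

print(f"Exadata used {hardware_fraction:.0%} of the racks, "
      f"eliminating {racks_eliminated:.0%}")
```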
The Records Sharing Value of the Cloud
Are you tired of having multiple versions of the same document so that you can collaborate with co-workers?
According to a new article by Federal Times, “Vendors Make Information Sharing Easier by Giving End Users More Control,” government agencies are adopting records sharing systems to help alleviate the pain of multiple versions of the same document and the inability to share and collaborate on documents with multiple users.
Get Rolling on Cloud Computing
Substantial cost savings and greater efficiencies—the main promises of cloud computing—along with recent government initiatives like the OMB’s Cloud First Policy and the Federal Cloud Computing Strategy are driving numerous agencies to investigate and invest in cloud computing.
On Wednesday, February 23rd, DLT sponsored a webcast with i360Gov.com titled, Best Practices for Achieving Migration to a Cloud Model. During the webcast, viewers learned the steps government agencies should take to move to cloud-based solutions. DLT’s own Van Ristau (CTO), NIST’s Dawn Leaf and In-Stat’s Greg Potter all presented their perspectives on the matter. If you missed it, don’t worry, you can view the archived version here.
Using SOA to Extend Beyond Your Four Walls, Part 1
Within service delivery circles, Service Oriented Architecture (SOA) has moved from buzzword to staple over the past five years. The idea of loosely coupled services that work together, rather than a bespoke monolithic application, has changed the software landscape. Decoupling the sources and consumers of data provides flexibility and scalability in application and infrastructure design. The hype may have moved on, but what's left behind is a solid, practicable architecture.
Asynchronous, message-based interfaces extend that divide across time and space. Reusable components speed the delivery and creation of new services. But as application resources diversify, they drag diverging data sources along with them. Accessing unrelated data sources from an SOA-based environment requires some unifying view, be it a consolidated hub that replicates data from different sources or a data federation service that manages the abstract aggregation logic. Monolithic, siloed applications can still be leveraged while being decomposed into their component functional service areas.
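One way to picture the "unifying view" described above is a thin federation service that hides several silos behind a single interface, aggregating one adapter call per source. The sketch below is purely illustrative: the class names, record shapes, and sample data are invented, not any particular product's API:

```python
class HRSystem:
    """Hypothetical legacy silo #1: employee directory."""
    def find(self, employee_id):
        data = {7: {"name": "Ada", "dept": "Engineering"}}
        return data.get(employee_id, {})

class PayrollSystem:
    """Hypothetical legacy silo #2: compensation records."""
    def lookup(self, employee_id):
        data = {7: {"salary_band": "E4"}}
        return data.get(employee_id, {})

class EmployeeFederationService:
    """Unifying view: one loosely coupled interface over many sources."""
    def __init__(self, sources):
        # Each source is wrapped as a callable adapter, so the service
        # never depends on any silo's native interface.
        self.sources = sources

    def get_employee(self, employee_id):
        record = {"employee_id": employee_id}
        for fetch in self.sources:
            record.update(fetch(employee_id))  # aggregate across silos
        return record

hr, payroll = HRSystem(), PayrollSystem()
svc = EmployeeFederationService([hr.find, payroll.lookup])
print(svc.get_employee(7))
```

Consumers code against `get_employee` alone, so each silo can later be decomposed or replaced behind its adapter without touching the consumers, which is the flexibility argument made above.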
Oracle Open World Recap Part IV.5.1
Exalogic
Larry Ellison continued his keynote from Sunday by reintroducing Exalogic, a “high performance server with hardware and middleware specifically designed for running public or private cloud systems.” “We spent a lot of time optimizing Oracle software to run on the Exalogic box,” Larry said. He referred to Exalogic as “one big honkin’ cloud,” called the system “the fastest computer for running Java applications software,” and said “it could be used for application consolidation or for running both public and private cloud systems.” The internal components of the system include:
The Oracle Exalogic Elastic Cloud server combines 64-bit x86 hardware, a total of 30 compute servers with 360 cores, with Oracle middleware such as the WebLogic server, Oracle Coherence data grid software, JRockit Java runtime software and Oracle VM virtualization software. The system uses InfiniBand technology (capable of handling 40 gigabits per second) to link its internal components, and has 2.8TB of DRAM, 4TB of read cache and 960GB of solid-state disk storage. Oracle will offer the Linux and Solaris operating systems with Exalogic.
Green Government Mandates and How to Meet Them
A recent article by a friend of ours, Caron Beesley, editor of [acronym] Online, has been getting a lot of great press lately. It discusses the innovative steps the federal government is taking to overcome many of the challenges of “going green” and meet a range of fast-tracked mandates.
In Fast-Tracking A Greener Government – Meeting Those Mandates, Beesley noted that the federal government, the largest consumer of energy in the U.S. economy, has often seen its energy efficiency projects hampered by cumbersome infrastructure, regulatory hairballs, and limits on energy upgrades to buildings.
Cloud Computing: Learn the Steps to Get There
On February 8, 2011, the White House released its Federal Cloud Computing Strategy. The new strategy works in conjunction with the OMB’s Cloud First Policy, which is intended to accelerate the pace at which federal agencies adopt the cloud. According to the document, the strategy is designed to:
• Articulate the benefits, considerations, and trade-offs of cloud computing
• Provide a decision framework and case examples to support agencies in migrating towards cloud computing
• Highlight cloud computing implementation resources
• Identify Federal Government activities and roles and responsibilities for catalyzing cloud adoption
By now, I’m sure just about everyone is familiar with the many benefits cloud computing offers. So I can safely bet that instead of asking “why,” your question now is “how.” And with this new push from the White House, there isn’t a better time to get the answer and begin the implementation process.