The month of September marks the busiest buying season for the federal government. In the final month of fiscal year 2018, an astonishing $97 billion was spent on 509,828 contracts. On average, this equates to $3.2 billion per day.
September is also getting busier and busier. Between 2015 and 2018, September spending increased by 39%.
It’s the most wonderful time of the year… as the song goes, and that is also true of the U.S. federal IT market right now. The month of September marks the end of the fiscal year and the beginning of the federal government’s annual spending frenzy. Federal agencies scramble to spend what’s left in their budgets, fearing that leaving excess funds will prompt Congress to appropriate less the following year. We call it “use it or lose it” spending, and it happens every year.
Congress and the Trump administration may not agree on much, but everyone wants to keep pressing federal agencies to modernize their information technology systems. In hearings just this month, for example, members of the House Veterans Affairs Committee bored in on the VA’s struggle to replace its electronic health record system and to modernize its legacy financial and other administrative systems.
Government agencies store large volumes of information. Yet poor data quality and a lack of collaboration across functions can be stumbling blocks to information governance. Some of the data is dirty, messy, and non-standardized, making data sharing a challenge. Without holistic data governance, data is chaotic.
But some organizations still see data governance as it used to be – siloed projects focused on compliance. Government enterprises spend all their time struggling to implement data governance. And, in the end, no one trusts the data.
When news broke last month that the Pentagon is still using 1970s-era floppy disks to run its nuclear program, most of us expressed incredulity. Unless, that is, you happen to work for the federal government.
According to federal CIO Tony Scott, the U.S. government spends 76% of its $88 billion IT budget on operating and maintaining out-of-date technologies – three times what it spends on modern systems.
Keeping pace with changes in enterprise-level technology is no easy feat. For educational institutions and universities in particular, making sense of the available options for managing complex operational and technological infrastructures can be mind-boggling.
The importance of cloud computing in this mix can’t be overstated. Today, nearly 70% of higher education institutions in North America have moved or are moving systems to the cloud, while 50% have adopted cloud-based collaboration systems.
Weeding out fraud is a big priority for the public sector. From unpaid and fraudulent taxes to benefit fraud, governments are turning to data to hit back.
Cloud adoption, enhanced service delivery, improved collaboration – these are common goals and challenges for governments of all types. Yet one common thread runs through most agencies, regardless of their strategic goals and missions: over-burdened systems.
“Big data” and the rise in demand for database services put increasing pressure on IT departments. End users demand access to data at higher speeds, while management wants that data delivered and managed at a lower operational cost.
Data stores are expanding at an exponential rate, compromising the performance of servers and applications. This growth also adds to the daily challenges and pains that administrators face – maintaining database health while meeting ever-evolving mission requirements.
On February 11, Red Hat announced its newest release – Red Hat Enterprise Virtualization 3.5 (RHEV for short). This latest upgrade to Red Hat’s open source virtualization platform promises greater visibility into provisioning, configuring, and monitoring of virtualization infrastructures, along with tighter integration with the OpenStack cloud infrastructure platform.