Don't let the title mislead you: today we're not talking about acquiring Agile services (though that plays a role) but rather about how the government is making its procurement process more flexible and dynamic to meet the needs of federal teams and citizens alike. We've written here about the challenges in government acquisition, from a retiring workforce, to end-of-year spending concerns, to incompatibility with modern technology. Given these challenges, we've seen a shift in recent years from a "that's the way it's always been" mentality to one of innovation.
There is some guidance on making changes to procurement, including the introduction of Other Transaction Authority (OTA), a way to carry out certain prototype, research, and production projects more quickly. OTA agreements incorporate commercial industry standards and best practices into their award instruments. But what is having a greater effect is agencies taking risks and trying new procurement methods on a one-off basis to see what works.
Lesley Field, the deputy administrator of the Office of Federal Procurement Policy, said in an interview, "I see a lot of appetite out there for taking risks, calculated risks and bringing our industry partners along." She went on to say that agencies should be willing to try new ways of acquiring goods and services, learn quickly from mistakes, and change course. Communicating those lessons learned across government is also crucial to government-wide procurement reform. Continue reading
Data center consolidation has been a mandated goal in the federal government for a number of years. The introduction of cloud, virtualization, and shared services means the government can run more efficiently with less hardware, no longer requiring huge physical servers to sit in buildings, many of which were built for the sole purpose of housing them. Consolidation saves money on technology and on the support of that technology, and it also reduces agencies' real estate footprints and needs. While agencies have made some strides, OMB sees the progress to date as going after low-hanging fruit and is now challenging agencies to think bigger.
In a draft policy issued in November, OMB stated, "Agencies have seen little real savings from the consolidation of non-tiered facilities, small server closets, telecom closets, individual print and file servers, and single computers acting as servers." The push now is toward moving to the cloud and shared services, and looking to commercial third parties to host government data.
More than moving servers and workloads, data center consolidation relies on changing the way agencies manage data. The Digital Accountability and Transparency Act (DATA Act) was enacted to make information on government spending more transparent. Doing so requires agencies to agree on and implement data standards so that information can be shared across government and openly with the public. Implementing those standards has been a stumbling block for compliance. Continue reading
The Federal Risk and Authorization Management Program, commonly known as FedRAMP, was introduced in 2010 and signed into policy at the end of 2011 as a "standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services." In plain English, it provides a baseline for agencies to determine whether a cloud solution is secure enough for them to use. Vendors pursue FedRAMP authorization as a way to prove their solutions are ready to plug and play into federal systems.
In recent years, cloud has moved from a curiosity for most agencies to a key part of IT infrastructure. With this change in cloud acceptance and use, FedRAMP has also started to evolve to meet today's needs. Last summer, Rep. Gerry Connolly introduced the FedRAMP Reform Act of 2018 to more stringently enforce the use of FedRAMP guidance. Continue reading
From time to time GovEvents will come across information we feel our members and audience would benefit from. Here's something we wanted to share from Bob Gourley, Founder and CTO at Crucial Point LLC:
For the last decade, enterprise architects have known the importance of engineering continuous connectivity to cloud services. If you want to use the cloud, you have to have a path to it.
For parts of the enterprise that operate at the edge, where connectivity can be an issue, it has been hard to design solutions that leverage public clouds. Depending on the organization, edge users might have some mix of public cloud, private cloud, data center access, and local compute, all of it complex and barely optimized. Continue reading
For those of us in the government market, October is the time to break out the Happy New Year noisemakers and celebrate the new government fiscal year (GFY). Each August and September is a frantic race for agencies to spend their remaining budget, which presents opportunity, but also a lot of hard work, for the vendors that want to earn some of this end-of-year shopping spree money. In recent years, the turning of the new fiscal year has also meant uncertainty. From shutdowns to continuing resolutions, the switch from one year to the next has not been as smooth as flipping a calendar page.
A group of senators has come forward to raise concerns about this annual end-of-year frenzy. A recent report found that the last week of the fiscal year accounts for 12.3 percent of spending [on IT]. Numerous other reports over the years have found similar statistics. In 2017 this equated to $11 billion in the final week of the year -- almost five times the average weekly spending for that year. This spending happens because agencies fear that if they do not use all the money they are allocated, their budgets will shrink in the future. This group of senators, as well as others in government, is looking at options for reforming the system to eliminate the potential waste resulting from this fast spending. Continue reading
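To make those figures concrete, here is a minimal back-of-envelope sketch, assuming a 52-week fiscal year and taking the reported "$11 billion final week" and the roughly 5x multiple at face value (the 12.3 percent figure comes from a separate multi-year analysis, so it will not match this single-year arithmetic exactly):

```python
# Back-of-envelope arithmetic for the FY2017 end-of-year IT spending figures
# quoted above. Assumptions: a 52-week fiscal year, and the ~5x multiple
# reported for the final week, taken at face value.

last_week = 11e9   # final-week IT spending in FY2017, per the report
multiple = 5.0     # final week was "almost five times" the average week

# Average weekly spend implied by the multiple:
avg_week = last_week / multiple        # $2.2 billion per week

# Annual IT spending implied by that weekly average:
implied_annual = avg_week * 52         # about $114 billion

print(f"implied average week: ${avg_week / 1e9:.1f}B")
print(f"implied annual IT spend: ${implied_annual / 1e9:.0f}B")
```

The point of the sketch is simply that front-loading a year's worth of roughly $2 billion weeks into one $11 billion sprint is what the "use it or lose it" incentive produces.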