Data center consolidation has been a mandated goal in the federal government for a number of years. The introduction of cloud, virtualization, and shared services means the government can run more efficiently with less hardware, no longer requiring huge physical servers to sit in buildings, many of which were built for the sole purpose of housing them. Consolidation saves money on technology and its support, and also reduces agency real estate footprints and needs. While agencies have made some strides, OMB sees the progress to date as going after low-hanging fruit and is now challenging agencies to think bigger.
According to a draft policy issued in November, OMB stated, "Agencies have seen little real savings from the consolidation of non-tiered facilities, small server closets, telecom closets, individual print and file servers, and single computers acting as servers." The push now is toward moving to the cloud and shared services, and looking to commercial third parties to host government data.
More than moving servers and workloads, data center consolidation relies on changing the way agencies manage data. The Digital Accountability and Transparency Act (DATA Act) was enacted to make information on government spending more transparent. Doing so requires agencies to agree on and implement data standards so that information can be shared across government and openly with the public. This implementation of standards has been a stumbling block for compliance.
The Internet of Things (IoT) is made up of webcams, sensors, thermostats, microphones, speakers, cars, and even stuffed animals. All of these connected devices can help individuals and organizations stay connected across geographic distances, keeping tabs on and managing assets from miles away. The data they collect can be combined with other data sets to create actionable advice for better management and service.
This holds incredible promise for local governments and federal agencies charged with maintaining safe operating fleets and facilities. There are also applications for improving the routing of field technicians, as well as traffic flow in general. But, as every superhero knows, with great power comes great responsibility.
As with any technology, IoT standards need to be developed for effective and safe use as well as to enable interoperability. NIST has been working on defining standards and recently released Considerations for Managing Internet of Things (IoT) Cybersecurity and Privacy Risks, but no federal agency is currently claiming jurisdiction over IoT policy and rule-making. In this vacuum, the legislative branch is getting involved. This past November, the House passed the SMART IoT Act, which tasks the Department of Commerce with studying the current U.S. IoT industry. A Senate bill was introduced to manage what types of IoT devices the government can purchase, ensuring that all IoT tech in government is patchable and has changeable passwords. Finally, states are even weighing in on the proper use of IoT in government. California passed the first IoT cybersecurity law, requiring device manufacturers to ensure their devices have "reasonable" security features.
The Continuous Diagnostics and Mitigation (CDM) program, led by the Department of Homeland Security, was designed to fortify the cybersecurity of government networks and systems with capabilities and tools that identify risks on an ongoing basis, prioritize these risks based on potential impacts, and enable personnel to mitigate the most significant problems first. The program was rolled out in phases with phases one and two pretty much complete across government.
Now that agencies know what and who is on their network, they need to move on to phase three: what is happening on the network. This involves installing and managing network and perimeter security measures. Given that the perimeter now includes mobile devices, securing those devices and the way they access the network is critical to meeting CDM goals. Currently, agencies are mapping out mobile connections at the agency level, and the networks with which agencies are regularly interacting.
The Federal Risk and Authorization Management Program, commonly known as FedRAMP, was introduced in 2010 and signed into policy at the end of 2011 as a "standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services." In plain English, it provides a baseline for agencies to determine if a cloud solution is secure enough for them to use. Vendors get FedRAMP certified as a way to prove their solution is ready to plug and play into federal systems.
In recent years, cloud has moved from a curiosity for most agencies to a key part of IT infrastructure. With this change in cloud acceptance and use, FedRAMP has also started to evolve to meet today's needs. Last summer, Rep. Gerry Connolly introduced the FedRAMP Reform Act of 2018 to more stringently enforce the use of FedRAMP guidance.
Blockchain is a complex technology that aims to streamline repetitive, data-intensive tasks. It has become more than a hot buzzword in government IT circles; it is already being put into practice.
One way to think of blockchain is as a database that is jointly managed by a distributed set of participants. Adding data requires the "sign off" of everyone in the chain, verifying that the transaction is legitimate. Because every piece is linked to the one before it, changing one piece impacts the rest of the chain (just like that one bulb going out on your Christmas lights), alerting all owners to an issue. This interconnectedness makes the record inherently tamper-evident.
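The Christmas-lights effect above comes from hash linking: each block stores a hash of its own contents plus the hash of the block before it. A minimal Python sketch of just that linking idea (not a full blockchain; the distributed sign-off by participants described above, and any consensus protocol, are omitted, and all names here are illustrative):

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder "previous hash" for the first block

def block_hash(data, prev_hash):
    # Hash the block's contents together with the previous block's
    # hash, so a change to any earlier block ripples forward.
    payload = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def add_block(chain, data):
    prev = chain[-1]["hash"] if chain else GENESIS
    chain.append({"data": data, "prev": prev, "hash": block_hash(data, prev)})

def verify(chain):
    # Valid only if every block's stored hash matches its contents
    # and points at the hash of the block before it.
    prev = GENESIS
    for block in chain:
        if block["prev"] != prev:
            return False
        if block["hash"] != block_hash(block["data"], block["prev"]):
            return False
        prev = block["hash"]
    return True

chain = []
add_block(chain, "permit issued")
add_block(chain, "permit renewed")
print(verify(chain))   # the untouched chain checks out

chain[0]["data"] = "permit revoked"  # tamper with an earlier record
print(verify(chain))   # the first block's hash no longer matches
```

Altering any one record invalidates every hash after it, which is why the technology suits processes requiring strict audit trails.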
Government agencies are drawn to the security and transparency provided by blockchain to improve the efficiency and stability of processes requiring strict audit trails. NIST has provided guidance to help educate as well as encourage organizations to begin trying out blockchain approaches.