The GEOINT Symposium is the nation's largest gathering of geospatial intelligence stakeholders from across industry, academia, and government. Hosted by the United States Geospatial Intelligence Foundation (USGIF), the event has become the gathering place for 4,000+ members of the worldwide geospatial community.
Geospatial Intelligence (GEOINT) was recognized as a discipline in the mid-1990s, when the imagery and mapping disciplines were combined into a single DoD agency that was later renamed the National Geospatial-Intelligence Agency (NGA). The combination showed that, together, these two disciplines offered powerful new opportunities for intelligence and analysis. The term "GEOINT" was coined by the Honorable James Clapper, and a community of mapping and imagery intelligence analysts began to grow.
The first GEOINT Symposium was held in a hotel meeting room with the expectation of 100 attendees, but even that first event drew many more to its standing-room-only sessions. Since then, the Symposium has grown year after year to become the flagship event for networking and professional development in the defense and intelligence communities and for others who use geospatial technology, including first responders and law enforcement.
Part of the President's Management Agenda (PMA) calls for leveraging data as a strategic asset for more effective government. In support of this goal, several pieces of legislation and policy have been created to enable, and even incentivize, agencies to make their data open and available for use across government and by citizens.
Federal CIO Suzette Kent recently said that the Federal Data Strategy will be released soon and will prioritize datasets that could help stimulate the economy, protect the nation, and continue important research. The guidelines will present principles that prioritize data security, privacy, and transparency.
This Federal Data Strategy follows the passage of the Open, Public, Electronic, and Necessary (OPEN) Government Data Act at the beginning of the year. This law requires that all non-sensitive government data be made available in machine-readable formats by default. It also creates a Chief Data Officers Council that will address data governance across agencies.
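The OPEN Government Data Act's "machine-readable by default" requirement is simple to illustrate: the same records that might once have been published as a PDF table become far more useful when serialized as JSON or CSV, which any program can parse directly. A minimal sketch in Python (the agency names and fields below are hypothetical, purely for illustration):

```python
import csv
import io
import json

# Hypothetical open-data records -- field names are illustrative only.
records = [
    {"agency": "NGA", "dataset": "elevation_grid", "year": 2019},
    {"agency": "NOAA", "dataset": "coastal_imagery", "year": 2018},
]

# Machine-readable JSON: parseable by any program without scraping.
as_json = json.dumps(records, indent=2)

# Machine-readable CSV: usable in spreadsheets and data pipelines alike.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["agency", "dataset", "year"])
writer.writeheader()
writer.writerows(records)
as_csv = buf.getvalue()

print(as_json)
print(as_csv)
```

The point of the mandate is exactly this property: structured formats round-trip losslessly between systems, where a scanned table or PDF does not.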
Even before these laws and guidelines were released, access to data was already having an impact on communities. For example, in Asheville, NC, BeLoved Asheville, an activist group of homeless people, launched the Homeless Voice Project, which filters public crime data using arrestees' addresses. By highlighting how many arrestees listed homeless shelter addresses, the group showed that the homeless population was being disproportionately targeted and arrested. In Norfolk, VA, community groups are using data to show the impact of redevelopment on communities, highlighting the scale of population displacement that would come with gentrification. These groups are finding there is less "shouting across the table," and common ground is easier to find, when arguments are backed with data.
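The Homeless Voice Project's approach, as described above, can be sketched in a few lines: match arrest records against a list of known shelter addresses and count the overlap. A minimal sketch in Python, with entirely hypothetical records and field names (the real project works with Asheville's public crime data, not this toy data):

```python
# Hypothetical arrest records -- charges and addresses are invented.
arrests = [
    {"charge": "trespassing", "address": "100 Shelter Way"},
    {"charge": "loitering",   "address": "100 Shelter Way"},
    {"charge": "speeding",    "address": "42 Oak St"},
    {"charge": "trespassing", "address": "7 Haven Ln"},
]

# Hypothetical list of known homeless shelter addresses.
shelter_addresses = {"100 Shelter Way", "7 Haven Ln"}

# Flag arrests whose listed home address is a known shelter.
shelter_arrests = [a for a in arrests if a["address"] in shelter_addresses]

share = len(shelter_arrests) / len(arrests)
print(f"{len(shelter_arrests)} of {len(arrests)} arrests "
      f"({share:.0%}) list a shelter address")
```

A real analysis would also need address normalization (abbreviations, apartment numbers) and a comparison against the homeless share of the general population before claiming disproportionate targeting, but the core join-and-count logic is this simple.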
Don't let the title mislead you: today we're not talking about acquiring Agile services (though that plays a role), but rather about how government is making its procurement processes more flexible and dynamic to meet the needs of federal teams and citizens alike. We've written here about the challenges in government acquisition--from a retiring workforce, to end-of-year spending concerns, to incompatibility with modern technology. Given these challenges, we've seen a shift in recent years from a "that's the way it's always been" mentality to one of innovation.
There is some guidance on changing procurement, including the introduction of Other Transaction Authority (OTA), a way to more quickly carry out certain prototype, research, and production projects. OTAs incorporate commercial industry standards and best practices into their award instruments. But what is having a greater effect is agencies taking risks and trying new procurement methods on a one-off basis to see what works.
Lesley Field, the deputy administrator of the Office of Federal Procurement Policy, said in an interview, "I see a lot of appetite out there for taking risks, calculated risks and bringing our industry partners along." She went on to say that agencies should be willing to try new ways of acquiring goods and services, to learn quickly from mistakes, and to change course. Communicating those lessons learned across government is also crucial to government-wide procurement reform.
Data center consolidation has been a mandated goal in the federal government for a number of years. The introduction of cloud computing, virtualization, and shared services means the government can run more efficiently with less hardware, no longer requiring huge physical servers to sit in buildings, many of which were built for the sole purpose of housing them. Consolidation saves money on technology and its support, and it reduces agencies' real estate footprints and needs. While agencies have made some strides, OMB sees the progress to date as going after low-hanging fruit and is now challenging agencies to think bigger.
In a draft policy issued in November, OMB stated, "Agencies have seen little real savings from the consolidation of non-tiered facilities, small server closets, telecom closets, individual print and file servers, and single computers acting as servers." The push now should be toward moving to the cloud and shared services, and toward looking to commercial third parties to host government data.
More than moving servers and workloads, data center consolidation relies on changing the way agencies manage data. The Digital Accountability and Transparency Act (DATA Act) was enacted to make information on government spending more transparent. Doing so requires agencies to agree on and implement data standards so that information can be shared across government and openly with the public. This implementation of standards has been a stumbling block for compliance.
With a focus on automation and digitization in government, there is a persistent fear that, just as science fiction films and books warned, robots will take over our jobs (and potentially, later, the world). The reality is that while some manual jobs will be "taken over" by machines, there is still a huge need for people to train and double-check those technologies. In automating rote functions, we are letting machines do what they do best--quickly capture and compute data--and freeing humans to do what they do best--make sense of the machines' outputs.
Government agencies are committed to training employees to reskill them into higher-value jobs that allow them not only to keep their jobs but also to elevate their skills and standing in the organization. It is not surprising that technology will also play a big role in that training.
Virtual Reality (VR) training is not new to government. The Defense Department has been using it for years to create realistic environments for training soldiers on expensive combat equipment and preparing them for new terrains and environments. Civilian agencies have begun using VR and Augmented Reality (AR) to better connect with citizens, making interacting with government services feel like playing a video game. Taking lessons learned from Fortune 500 companies, the government can now extend its use of VR to general workforce training.