The Census Bureau's mission is "to serve as the nation's leading provider of quality data about its people and economy." 2020 is a decennial census year, in which the government is required by Article I, Section 2 of the Constitution to collect data on the population of the country. This data is used to determine the number of seats each state has in the U.S. House of Representatives and inform the distribution of billions in federal funds to local communities. The 2020 questionnaires will begin arriving at homes in mid-March. All households receiving a questionnaire are required to fill it out and return it. Households that have not responded will be visited by census takers beginning in May.
The first census took place in 1790, one year after George Washington took office. For this initial census, marshals visited every house and collected data. The process took months, and the end results were questioned for accuracy and completeness. Since then, the process by which census data is collected has continued to evolve.
In 1890, a punch card system was used for the census. This automation was developed specifically to handle the growing volume of data that needed to be processed. The company that developed this technology went on to become IBM. Moving ahead 130 years, this year's census marks the first time people will be able to submit their responses online.
With the closing of the decade, we thought it would be interesting to look back at the top technology headlines of 2009 and compare them to where the market is today.
Data on the Rise
Big news was the launch of data.gov in late May of 2009. The site was championed by the country's first Federal CTO, Vivek Kundra, as a way to enable citizens to access federal data. In addition to making the government more transparent, the hope was that the private sector could use the massive amount of federal data in research and to create innovative programs and solutions. The site launched with 47 data sets; as of its last reported count (June 2017), it held approximately 200,000 datasets, representing about 10 million data resources. Beyond these numbers, data.gov's impact has been significant.
Thousands of programs can point to the site as the basis for their development. More importantly, it launched a new way of thinking in government. Agencies stopped being as territorial about their data and slowly but surely became more open to sharing it with one another and with the public as they saw what innovation could happen with simple access. In 2019, the vision of data.gov expanded with the Open, Public, Electronic and Necessary Government Data Act, requiring that nonsensitive government data be made available in machine-readable, open formats by default.
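The programmatic access that made this innovation possible is straightforward: data.gov's catalog runs on the open-source CKAN platform, whose `package_search` endpoint accepts a free-text query and returns matching dataset metadata as JSON. As a minimal sketch (the query string and row limit here are illustrative), a search URL can be built like this:

```python
from urllib.parse import urlencode

# data.gov's catalog is built on CKAN; the package_search action
# performs a free-text search over dataset metadata and returns JSON.
CATALOG = "https://catalog.data.gov/api/3/action/package_search"

def dataset_search_url(query: str, rows: int = 5) -> str:
    """Build a catalog search URL for `query`, limited to `rows` results."""
    return f"{CATALOG}?{urlencode({'q': query, 'rows': rows})}"

url = dataset_search_url("census population")
print(url)
```

Fetching that URL (for example with `urllib.request.urlopen`) returns a JSON document whose `result.results` list holds one metadata record per matching dataset.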
Artificial Intelligence (AI) continues to dominate tech headlines. Now, rather than learning what the technology could mean for government, we're reading about where it's being implemented and the results being achieved. A recent report found that AI is no longer considered optional, but rather a critical component of managing and using large amounts of data. IT leaders in government are looking to AI to automate routine, data-oriented tasks, ease access to diverse sets of data, prioritize tasks based on the benefit to the organization, and generally keep track of ever-growing streams of data.
The Intelligence Community (IC) has long been a top consumer and analyzer of data in government. Not surprisingly, it has embraced AI technology to supplement the work of analysts, reducing the amount of manual data sorting through machine-assisted, high-level cognitive analysis. AI is being used to help triage, so highly trained analysts can spend their time making sense of the data collected by looking at the most valuable and seemingly connected pieces.
Health and Human Services (HHS) implemented an AI solution when it needed to quickly procure hazmat suits in response to an Ebola outbreak. Procurement officials were able to use AI to make like-to-like comparisons among products. After the initial tactical analysis, the acquisition teams were able to use the data gathered on department-wide pricing and terms and conditions to better define parameters for ten categories of purchases.
Despite successful implementations in many agencies, AI is still in the pilot and introductory phase. The Air Force is making it easier to begin experimenting with AI. Because the DoD has strict rules about what can be put on its networks, it is difficult to introduce new technologies into the production environment. The Air Force has created a workaround with the Air Force Cognitive Engine (ACE), a software ecosystem that connects the core infrastructures required for successful AI development: people, algorithms, data, and computational resources.
HHS is looking to use AI to analyze dated regulations as part of its AI for deregulation project. The pilot has found that 85 percent of HHS regulations issued before 1990 have never been edited and are most likely obsolete. Using AI to flag regulations containing the term "telegram," for example, helps prioritize which documents need to be reviewed by humans.
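HHS hasn't published the details of its implementation, but the keyword-flagging step described above can be sketched in a few lines. The term list, regulation IDs, and text below are all invented for illustration:

```python
# Illustrative sketch of keyword-based triage: flag regulation text that
# mentions terms suggesting it predates modern technology, so that humans
# can review those documents first. Terms and records are made up.
OBSOLETE_TERMS = {"telegram", "telegraph", "carbon copy", "microfiche"}

def flag_for_review(regulations: dict[str, str]) -> list[str]:
    """Return the IDs of regulations whose text mentions any obsolete term."""
    return sorted(
        reg_id
        for reg_id, text in regulations.items()
        if any(term in text.lower() for term in OBSOLETE_TERMS)
    )

sample = {
    "42-CFR-1001": "Notices may be delivered by telegram or first-class mail.",
    "42-CFR-1002": "Records shall be retained electronically.",
}
print(flag_for_review(sample))  # → ['42-CFR-1001']
```

A production system would presumably go beyond literal keyword matching, but even this simple pass turns an unreviewable backlog into a ranked queue for human attention.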
Fall visits to the farmers market take us back to simpler times when people lived off the land. Today's farmers may provide the same "output" of food, but how they manage the growth and distribution of it has changed dramatically.
The U.S. Department of Agriculture (USDA) was established in 1862 and was nicknamed "The People's Department" by President Lincoln because of its mission to support the farmers that feed the nation. Today, the USDA is focused on providing "leadership on food, agriculture, natural resources, rural development, nutrition, and related issues based on public policy, the best available science, and effective management."
In achieving this mission, the USDA has become a hub for innovation. It was chosen as the first host agency for a modernization Center of Excellence (CoE). Spearheaded by the General Services Administration (GSA), the CoE at USDA was established to accelerate IT modernization across government to improve the public experience and increase operational efficiency. The CoE centralizes top government tech talent and combines it with private sector experts and expertise to implement best practices to move processes and technologies ahead. The CoE is focused on five functional areas: Cloud Adoption, Contact Center, Customer Experience, Data Analytics, and Infrastructure Optimization.
Part of the President's Management Agenda (PMA) calls out leveraging data as a strategic asset for more effective government. In support of this, several pieces of legislation and policy have been created to better enable and even incentivize agencies to make their data available and open for use across government and by citizens.
Federal CIO Suzette Kent recently said that the Federal Data Strategy will be released soon and will prioritize datasets that could help stimulate the economy, protect the nation, and continue important research. The guidelines will present principles that prioritize data security, privacy, and transparency.
This Federal Data Strategy follows the passage of the Open, Public, Electronic, and Necessary (OPEN) Government Data Act at the beginning of the year. This law requires that all non-sensitive government data be made available in machine-readable formats by default. It also creates a Chief Data Officers Council that will address data governance across agencies.
Even before these laws and guidance were released, we've seen how access to data can impact communities. For example, in Asheville, NC, BeLoved Asheville, an activist group of homeless people, launched the Homeless Voice Project. This project filters public crime data using arrestees' addresses. By highlighting the number of arrest records listing homeless shelter addresses, the group was able to show that the homeless population was being disproportionately targeted and arrested. In Norfolk, VA, community groups are using data to show the impact of redevelopment on communities, highlighting the size of the population displacement that would come with gentrification. These groups are finding there is less "shouting across the table" and common ground is easier to find when arguments are backed with data.
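The Homeless Voice Project's exact method isn't published, but the core analysis described above, matching arrest records against known shelter addresses, can be sketched simply. The addresses and records below are entirely invented:

```python
# Sketch of the analysis described above: what share of arrest records
# list a known homeless shelter as the home address? All data is invented.
SHELTER_ADDRESSES = {"123 Haywood St", "45 Patton Ave"}

def shelter_arrest_share(arrests: list[dict]) -> float:
    """Fraction of arrest records giving a shelter as the home address."""
    if not arrests:
        return 0.0
    hits = sum(1 for a in arrests if a["address"] in SHELTER_ADDRESSES)
    return hits / len(arrests)

records = [
    {"id": 1, "address": "123 Haywood St"},
    {"id": 2, "address": "9 Elm St"},
    {"id": 3, "address": "45 Patton Ave"},
    {"id": 4, "address": "123 Haywood St"},
]
print(shelter_arrest_share(records))  # → 0.75
```

Comparing that share against the homeless population's share of the city is what turns an open dataset into an argument, which is precisely the kind of use the OPEN Government Data Act's machine-readable mandate enables.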