Confluent Public Sector Summit Demonstrates Government’s Passion for Data

The first-ever Confluent Public Sector Summit confirmed the keen interest government organizations have in using data to advance their missions. A diverse group of over 160 participants, including data scientists, IT architects, regulators, acquisition professionals, operations leaders, and more, heard and shared insights on how best to harness the power of data to move missions forward.

Throughout the day, several overarching themes emerged: 

  • Trust – building trust in the data is critical, but so is building trust among data “owners” so that they see themselves as “stewards” who freely share their data.

  • AI – success with AI depends on clean, trustworthy data.

  • Everyone is a data professional – we need to train the whole of the workforce to understand how to use the data products now available to them.

Through each session, government professionals shared their challenges, successes, and hopes for the future of data-centric government operations. 

CDOs Driving Data Energy

Chief Data Officer (CDO) is still a relatively new role, and the CDOs who spoke at the Public Sector Summit are embracing the opportunity to define what they and the data they oversee can do for their agency missions. Panelists Captain Brian Erickson, CDAO, U.S. Coast Guard (USCG); Elizabeth Puchek, CDO, U.S. Citizenship and Immigration Services (USCIS); and Carin Quiroga, CDO, Immigration and Customs Enforcement (ICE) talked about how they are shaping the way their agencies look at and use data. 

The passion they have for the power of data to transform how people work and make decisions came through in the panel discussion as they talked about how they are introducing data strategies to their organizations. 

Quiroga explained how she created “data bodegas” as opposed to data warehouses. The approach mimics a shopping experience, highlighting different data products and offering the tools people need to use them, such as data dictionaries and APIs. All of the panelists agreed that the bodega approach makes data feel more accessible, a key step in getting people to use and trust it. 

Captain Erickson shared that the A (for AI) was added to his professional title of CDAO to signal to the workforce the importance of AI to the Coast Guard’s mission. USCG has an incredibly young workforce that is open to modern tools and analysis, and he acts as the change agent liberating data from the organization’s traditional silos and legacy applications. His team illustrated data’s decision-making power when the Coast Guard needed to reduce its number of operating locations because of a drop in staffing levels. Analyzing what a reduction in force in different areas would do to overall operations informed decisions on which units should continue at current levels and which could close or reduce staffing. 

Puchek discussed the power of visualization for getting buy-in on data sharing and use. Her team created a director’s catalog, a suite of high-level enterprise dashboards that directors review and that is available to others in the organization; if the head of the agency is looking at these dashboards, so should everyone else. The ability to easily consume data visually drove one advisor to engage the CDO organization to help address the backlog of unadjudicated events in the humanitarian program. Working together, they determined which data was most critical to moving cases along, then cross-referenced those data points to prioritize clear-cut cases and make decisions more quickly. 

Data Ensures We’re Never in a Fair Fight

The event keynote speaker, Young Bang, Principal Deputy Assistant Secretary of the Army, spoke about how he’s working to meet the data needs of today’s Army. The Army’s modernization goals and joint force vision are well documented, and Bang detailed how data and AI can help meet them. He laid out the guiding principles for Army transformation:

  • Stabilize, simplify, and flatten – put the Army on a “data diet” to reduce technological debt and free up resources to work on new solutions.

  • Low signature – data has a considerable heat signature. How do we help it hide in plain sight?

  • Constant iteration and innovation – radically improve the speed at which solutions get to the field with software-defined and data mesh-enabled approaches.

  • Intuitiveness – human-centered design solutions are easy to use and reduce training burden.

  • Interoperability – both technical and procedural, abstracting technology at all layers and reimagining operations with automation.

Bang described how work on the Integrated Personnel and Pay System (IPPS-A) reflects this approach. The project used Apache Kafka® to integrate 38 interfaces, simplifying the architecture and introducing a modern user interface that makes daily tasks easier to complete. 
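For readers less familiar with the pattern, here is a minimal sketch (in Python, using the confluent-kafka client) of how a shared Kafka topic can replace a web of point-to-point interfaces: a source system publishes an event once, and any number of downstream systems consume it independently. The broker address, topic name, and event fields are illustrative assumptions, not details from IPPS-A.

```python
# Hub-and-spoke integration sketch: publish once, consume anywhere.
# All names here (broker, topic, fields) are hypothetical.
import json

from confluent_kafka import Consumer, Producer

BROKER = "localhost:9092"  # placeholder broker address

# A source system publishes a personnel event to one shared topic...
producer = Producer({"bootstrap.servers": BROKER})
event = {"member_id": "12345", "action": "promotion", "effective": "2023-10-01"}
producer.produce("personnel-events", key=event["member_id"], value=json.dumps(event))
producer.flush()

# ...and each downstream system (pay, readiness, dashboards) reads it
# independently with its own consumer group, with no point-to-point link.
consumer = Consumer({
    "bootstrap.servers": BROKER,
    "group.id": "pay-system",          # each downstream system uses its own group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["personnel-events"])
msg = consumer.poll(timeout=10.0)
if msg is not None and msg.error() is None:
    print(json.loads(msg.value()))
consumer.close()
```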

Bang highlighted that the shifts in how we use data will require a change in how we acquire technology. He detailed how he is developing a digital contracting Center of Excellence (CoE) to help the Army be more flexible in acquiring AI and software. One approach is a continuous Authority to Operate (ATO) that uses continuous monitoring to reduce compliance paperwork. 

Building Resiliency 

Security was discussed with an eye toward the resiliency of the infrastructures that support a data-centric organization. Panelists Dr. Tiina Rodrigue, CISO, Consumer Financial Protection Bureau; Dr. Kelly Rose, Technical Director, National Energy Technology Laboratory; and Harrison Smith, Director, Enterprise Digitalization, IRS, shared the decision cycles involved in building data-sharing platforms. 

With highly segregated systems, it’s critical to examine what vulnerabilities bringing them all together will introduce, and to understand where the risks really lie. In the cloud, the technology itself is not the security risk; risk is introduced when people misconfigure it or apply processes incorrectly. 

In building systems that are not only secure but also usable, the panel discussed the involvement of three types of users:

  1. People who love the current system.

  2. People who hate the current system.

  3. People who did not even know there was a system.

Only by listening to all three groups can you innovate and build a system that meets user expectations. 

Building Efficiency Into Data Sharing 

A panel featuring Commander Johnathan White, Data & Cloud Branch Chief, USCG; Tyson Walker, Passenger Systems Program Director, Customs and Border Protection (CBP); and Matthew Corazza, Cargo Systems Program Director, CBP discussed the key challenges in opening up data for use across organizations. As mentioned in other discussions, this problem is twofold: technical and cultural.

From a technical standpoint, the speakers highlighted the importance of data deduplication. The end goal is to create a single source of truth for everyone. From reducing storage costs to ensuring everyone works from the most current data (not a copy saved two weeks ago on a local drive), centralizing access to data is critical. Done correctly, this single source of truth also addresses the cultural pushback to data sharing: the fear that “they are going to mess up my data.” 
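One concrete way to implement that single source of truth with Kafka is a log-compacted topic: compaction retains only the latest record per key, so every consumer reads current state instead of a copy saved weeks ago. The sketch below creates such a topic with the confluent-kafka admin client; the topic name and settings are assumptions for illustration.

```python
# Create a log-compacted topic to serve as a continuously updated
# "single source of truth." Names and sizing are illustrative.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "localhost:9092"})
topic = NewTopic(
    "case-records",                          # hypothetical shared dataset
    num_partitions=6,
    replication_factor=3,
    config={
        "cleanup.policy": "compact",         # keep only the newest record per key
        "min.cleanable.dirty.ratio": "0.1",  # compact relatively aggressively
    },
)
for name, future in admin.create_topics([topic]).items():
    future.result()  # raises if creation failed
    print(f"Created compacted topic: {name}")
```

Because compaction works per key, choosing a stable entity identifier (a case number, for example) as the message key is what makes the topic behave like a continuously updated table.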

In a data-sharing model, service-level agreements (SLAs) should be implemented for response times. Walker pointed out that you cannot depend on the speed of other people’s systems: if it takes too long to get data from one system to another, users will simply download it to a system fast enough for their jobs, which leads to duplication and stale data. Beyond SLAs, the panel discussed the need for standards and governance to define what data gets shared and with whom.
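One simple way to watch such an SLA is to measure consumer lag, i.e., how far a downstream system has fallen behind the live stream; growing lag means consumers are effectively reading stale data. This sketch assumes a hypothetical topic, consumer group, and lag threshold.

```python
# Data-freshness check: compare the latest written offset (high watermark)
# with the offset a downstream group has committed. Names are made up.
from confluent_kafka import Consumer, TopicPartition

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "partner-agency-reader",  # the downstream group being checked
})

tp = TopicPartition("case-records", 0)
_, high = consumer.get_watermark_offsets(tp, timeout=10.0)  # newest offset written
committed = consumer.committed([tp], timeout=10.0)[0]       # group's last commit
lag = high - (committed.offset if committed.offset >= 0 else 0)
print(f"partition 0 lag: {lag} records")
if lag > 10_000:  # arbitrary threshold for illustration
    print("SLA at risk: downstream consumer is falling behind")
consumer.close()
```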

Streaming the Future

The event’s Solution Showcase highlighted Kafka implementations that show how immediacy in data sharing impacts missions. RAFT’s Communication Broker is used to coordinate airspace during crisis and conflict. It pulls data from various sensors to monitor the health of software and data feeds, confirming that systems are working as they should and that the data presented is correct. The information is presented so that anyone, not just an engineer, can watch for issues and quickly intervene, taking anomalous data out of the equation until it is verified. 
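As an illustration of that pattern (not RAFT’s actual implementation), a small stream processor can screen incoming readings and route anything that fails a sanity check to a quarantine topic until it is verified. The topic names and the validity rule here are assumptions.

```python
# Route suspect sensor readings to a quarantine topic for review.
# Topic names and the sanity check are hypothetical.
import json

from confluent_kafka import Consumer, Producer

conf = {"bootstrap.servers": "localhost:9092"}
consumer = Consumer({**conf, "group.id": "sensor-screen",
                     "auto.offset.reset": "earliest"})
producer = Producer(conf)
consumer.subscribe(["sensor-readings"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        reading = json.loads(msg.value())
        # Hypothetical sanity check: flag readings outside a plausible range.
        if 0 <= reading.get("value", -1) <= 1000:
            target = "sensor-readings-clean"       # flows on to operators
        else:
            target = "sensor-readings-quarantine"  # held until verified
        producer.produce(target, key=msg.key(), value=msg.value())
        producer.poll(0)  # serve delivery callbacks
finally:
    producer.flush()
    consumer.close()
```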

ASRC Federal illustrated how its team consolidates feeds to present information in a way that can quickly be acted on. For example, the DISA unified situational awareness solution pulls data from 65 physical and virtual nodes across different data centers for analysts to access and use. The company also created the Mission Operations Assistant for the National Oceanic and Atmospheric Administration (NOAA), a search engine that layers key insights onto log messages.

The Public Sector Summit showcased the power of data to help the government meet its goals for citizen service, improved security, situational awareness, and ongoing innovation.

Click here for more insights from the day. 
