To improve its customer experience, SSA found an unusual partner from the NFL

The Baltimore Ravens football team may be better known for its winning ways on the field and its rabid fans in the stands.

But the Social Security Administration turned to the NFL team because of its prowess in using data to drive customer experience decisions. It also didn’t hurt that SSA headquarters is located in Baltimore County, Maryland, and many of the staff are big fans of the team.

Patrick Newbold, the assistant deputy commissioner and deputy chief information officer at SSA, said the Ravens are known for providing a great customer experience for their fans, so it just made sense that the agency would reach out.

“One of the questions we asked the Baltimore Ravens was how business intelligence analytics changed their service delivery model?” Newbold said on Ask the CIO. “The Ravens shared an excellent use case with us on how data was able to challenge one of their assumptions on fan demographics. Early on, when they started to aggregate that data, that data disproved assumptions they had about their season ticket holders. Their fans were a lot younger than the marketing assumed. So that led them to change the music they played, the food and drinks they served and how they engaged those fans. The data provided the Ravens with some insights into fan demographics that they weren’t necessarily tracking, and allowed them to market to a growing demographic fan base.”

The Ravens brought their chief data officer (or the equivalent position) to the table to meet with executives from SSA’s CIO, CDO and mission offices.

Like the way the Ravens use data to drive decisions about how they serve their fans, SSA is looking to apply the same concepts to how it delivers its services.

“We want to use data to monitor and improve the way we do business and services, and deliver our services to our citizens,” Newbold said. “We also shared several challenges. One was the importance of data collection. The Baltimore Ravens leverage NFL-wide data as well as their Baltimore Ravens-specific data. They use that data to inform decisions. We, at SSA, want to create a primary source of SSA-wide data that is beyond assumptions and that supports that ad hoc, cross-cutting capability to do some data analytics. While we are completely different organizations, we have the same goals and mission desire when it comes to how we can use data to really inform the way we want to move forward.”

SSA scores better than average

The Ravens, Newbold said, have a mature data and business intelligence practice, so gleaning lessons learned can only help SSA, which scored a 64 on the 2021 American Customer Satisfaction Index (ACSI) ratings. The federal government’s overall score was 63.4, while the Interior Department received the highest score under the ACSI with a 77.

SSA’s 2020 survey data found that 93% of the almost 1,700 respondents rated their field office experience as “satisfactory,” but only 47% called it “excellent.”

Newbold said one of the biggest lessons learned from the conversation with the Ravens was the importance of data governance, because business intelligence platforms and tools are only as good as the data being put into them.

“Key points that we learned from Baltimore Ravens and throughout the discussions is really having that strong governance, but also they highlighted how they use data as a tool, not as the final answer,” he said. “That resonates with us because as we invest more beyond technologies as an agency, we also must recognize that other factors inform decisions, so data is critical and important, but not the only factor.”

The Ravens are just one of several public and private sector organizations SSA is meeting with to learn more about how they serve their customers.

Newbold said SSA also has met with JPMorgan Chase, the Federal Retirement Thrift Investment Board, Fannie Mae and Target Corp.

“We also met with a couple of thought leaders since June, the former General Motors CIO Ralph Szygenda and the former IRS Commissioner Charles Rossotti,” he said. “We take these conversations and we’ve highlighted about three important lessons learned from these conversations, and we are baking those into our strategy. They are around governance, data and culture.”

New strategy coming

Newbold said SSA is updating its digital transformation strategy to include the customer experience lessons learned from all of these conversations.

SSA is partnering with the U.S. Digital Service on its modernization strategy and effort.

Newbold said his office and the mission areas are working with USDS to further expand their understanding of customers and their journeys in using SSA services.

“A key objective and expansion of our digital service offerings is a redesign of our website to enhance the user experience. To improve customer service, we plan to deepen our understanding of our customers, including what drives their evolving service [needs]. We will learn about our customers’ journeys from various service channels and touch points, and one of those is voice-of-the-customer feedback. We want to capture real-time customer feedback, not only to use that feedback to assess what we have in place that is working, but to identify customer pain points to help us design those future digital services.”

To better understand those customer journeys, SSA and USDS held about 65 sessions with multiple groups of people. This led SSA to use human-centered design techniques for the new beta version of its website, which launched in April.

“For many of our services, and especially on mobile devices, we really want to ensure that we offer more digital capabilities that can be leveraged on mobile devices and from any location. We released an application that allows customers to express a protective intent to file for Social Security Supplemental Security Income benefits online,” he said. “We have also prioritized within our plan the design of a mobile-accessible online process that will upload forms and other documentation.”

Newbold added SSA has received positive feedback so far from the upgrades and plans to expand its interactions and testing with customers.

Reducing the burden on customers

Going forward, Newbold said SSA plans to continue to meet with the Ravens and other private sector organizations on a regular basis.

He said all the different public and private sector organizations help the agency learn more about how it can drive better customer experience. SSA also has begun to implement a customer relationship management (CRM) platform to further its efforts.

“By reducing the burden on the public, we want to eliminate requirements to conduct business in person, present hard copies of original documents, remove requirements for signatures on a document or provide electronic signing options. These objectives will require SSA to reimagine business processes, program policies and enabling technologies,” Newbold said. “We also want to modernize our enterprise IT systems. For example, our system that administers benefits has been cited by GAO as one of the 10 IT systems across the executive branch in most need of modernization. We have begun to modernize the claims intake and adjudication software. But we want [to] continue that work and retire the legacy systems; modernizing our benefits system remains a focus for us.”

Creating a safe space for IoT

IoT Security Month — June 21, 2022

The innovative world of the Internet of Things means industry and government can build things better, stronger and faster. They can gather more information and quickly integrate it. However, there is a price to moving that much data around at that speed: keeping pace with security concerns in a sophisticated, quickly evolving environment. The world of cloud storage and remote sensors is under constant cybersecurity threat.

“It’s definitely a balancing act between using the latest technologies, but also making sure they’re secure at the same time,” Tim Mierzwa, enterprise strategy lead for the information technology resources branch at the National Center for Advancing Translational Sciences (NCATS), said on Federal Monthly Insights — IoT Security. NCATS uses IoT to create biological and chemical profiling that aids in the development of drugs and treatments in the medical field.

“From a clinical perspective, you can gather all sorts of health metrics, from heart rates to insulin levels, things like that. And you do need to sort of minimize or anonymize that data as well, because that data in the wrong hands can definitely be very dangerous,” Mierzwa said on Federal Drive with Tom Temin.

NCATS has seen steady growth since its inception in 2012, according to Mierzwa. Starting small meant it only grew big enough to form its own cybersecurity division in 2019. As the center moved forward with developing new technologies, the security team had to move fast to keep up and maintain secure data. That meant developing protocols and governance plans to safeguard large amounts of data containing personal information.

At FEMA, IoT sensors send information that allows the agency to move with greater speed and accuracy in emergencies. For example, AT&T provides a service called FirstNet for first responders.

“I get signal on FirstNet in garages where my T-Mobile personal phone has no hope of working. So, you know, the utilization of being able to have a robust network like that, that can handle communications for all these devices is critical to what we do,” said James Rodd, cloud portfolio manager at FEMA.

“Security is a massive concern for us. Obviously, responding to emergencies, the last thing we want to happen is some kind of security attack that would prevent us from doing that,” Rodd said. While keeping up with protocols is a constantly evolving process, he said sometimes nothing is more important than remembering the basics.

“One of the most critical things that we tend to not pay enough attention to is your baseline updates and stuff like that: just making sure that your mobile devices are updated to the latest firmware, and that you’re aware of any security [updates],” he said.

Part of the network that FEMA relies on involves sensors in remote areas. Those sensors can detect floods and wildfires, sometimes before anyone local has noticed a problem. As those networks grow, their security also has to improve and keep evolving.

“Unfortunately, in FEMA, sometimes we kind of silo ourselves because we’re responding to an incident. And we put procedure and policy into effect, and then find out that it doesn’t meet the requirements of our executive order or whatever, zero tolerance. And then we have to go and kind of backwards engineer our solution, which we’ve already been utilizing. I hate to say it, but when do we find out that it’s not good? Usually during an audit, you know, and that’s not a good time to be finding out. So I would definitely say networking is a huge factor, and making sure you go to the sources,” Rodd said.

State Dept. ‘Data for Diplomacy’ winner recognized for COVID-19, air quality projects

From the distribution of COVID-19 vaccines to personnel overseas, to tracking the air quality at U.S. embassies and consulates, the State Department is looking to make data-driven decisions in all aspects of its mission.

To prioritize that focus under its recent Enterprise Data Strategy, the department last month recognized five winners of its first annual Data for Diplomacy Awards.

Dr. Molini Patel, with the department’s Bureau of Medical Services, won one of this year’s individual awards for her work using data to track COVID-19 infections among the department’s workforce and flagging hotspots that faced the greatest need for vaccines once they became available.

The award also recognized Patel’s work tracking air quality data at 80 U.S. diplomatic posts around the world.

The department, in a press release announcing the winners, said Patel “completely revolutionized MED’s ability to gather data and mold it into products that are easily understood and used by policymakers both within the bureau and the Department at large.”

Patel said in a recent interview that the Bureau of Medical Services, prior to the pandemic, focused mainly on providing patient care to individuals, and not collecting big data and using it for decision-making.

“We don’t have a culture necessarily of producing big data for decision-making, but we needed that during the COVID pandemic,” Patel said on the latest episode of All About Data.

The Bureau of Medical Services has nearly 300 health units at diplomatic posts. In the early days of the pandemic, a task force within the bureau established a system for collecting information on department personnel who were infected with COVID.

“At the beginning, there was a very crude system where overseas health units were calling into our task force that we established within a few days, and we were just tracking those cases on an Excel sheet,” Patel said.

Within a few days, Patel said the bureau sent its first report to the department leadership on how many personnel were sick with COVID, which helped leadership make decisions on mission operations.

Patel said that the bureau, at the same time, began developing a more structured data reporting and collection system.

As the pandemic evolved, the bureau used its data to understand where the department’s COVID hotspots were and where to send a limited supply of vaccines.

“We tried to gather as much information about how COVID was impacting individual locations, be it COVID cases among our own personnel, be it COVID cases in the general population, hospitals, death data. Whatever science-driven metrics were available informed that decision on where we were going to send those vaccines,” Patel said.

The department also recognized Patel for her work using data to track air pollution at posts overseas – the mission that originally brought her to work at the agency.

For some context, about 80% of the State Department’s 300 overseas posts have annual air particle levels above U.S. health standards.

“We know that it’s a widespread problem, but not all of those locations with high air pollution levels have data on air quality,” Patel said.

The department’s air monitoring program office created a smartphone app, ZephAir. Patel and her team have laid the groundwork for supplying the data that powers the app.

The app displays air quality metrics from the department’s own air monitors, but also displays data from several local governments that have formed partnerships with the State Department.

“If it’s good air quality, and it’s a great day to be outside, that’s what the health messaging will say – ‘It’s a great day to be outside, take advantage of this great day.’ At higher pollution levels, for example, if it’s unhealthy, the app will display a message saying, ‘Take it easy, reduce your time and intensity of outdoor activity.’ At the worst air quality, hazardous health messaging will say to consider moving indoors,” Patel said.

The app also can be programmed to send a person an alert, based on a location’s air quality, or if they have a heart, lung or other medical condition that makes them more susceptible to air pollution.
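
Because the EPA air quality index defines fixed category breakpoints, that kind of tiered messaging reduces to a simple threshold lookup. Below is a minimal sketch in Python; the AQI breakpoints are EPA’s published categories, but the message wording and the handling of sensitive users are illustrative assumptions, not ZephAir’s actual logic.

```python
# EPA AQI categories (upper bound of each band) with illustrative advisories
# paraphrased from the article; the real ZephAir wording may differ.
AQI_BANDS = [
    (50,  "Good",                           "It's a great day to be outside."),
    (100, "Moderate",                       "Unusually sensitive people should consider limiting prolonged exertion."),
    (150, "Unhealthy for Sensitive Groups", "Sensitive groups: reduce time and intensity of outdoor activity."),
    (200, "Unhealthy",                      "Take it easy: reduce your time and intensity of outdoor activity."),
    (300, "Very Unhealthy",                 "Avoid prolonged outdoor activity."),
    (500, "Hazardous",                      "Consider moving activities indoors."),
]

def health_message(aqi: int, sensitive: bool = False) -> str:
    """Return an advisory for an AQI reading. Treating a sensitive user
    (heart or lung condition) one band more strictly is an assumption,
    not documented app behavior."""
    for i, (upper, category, message) in enumerate(AQI_BANDS):
        if aqi <= upper:
            if sensitive and i + 1 < len(AQI_BANDS):
                message = AQI_BANDS[i + 1][2]  # stricter advice, one band early
            return f"{category} (AQI {aqi}): {message}"
    return f"Hazardous (AQI {aqi}): Consider moving activities indoors."

print(health_message(42))                   # Good: a great day to be outside
print(health_message(168, sensitive=True))  # stricter advice for at-risk users
```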

“There are a lot of air quality apps out there run by commercial entities, but they don’t use the Environmental Protection Agency’s air quality index. We have a partnership with the EPA. We obviously believe in what they’re doing, in terms of using, communicating the right science-based metrics, and so we wanted to communicate air quality using that air quality index. The only way to do that was to create our own app,” Patel said.

Patel said that air pollution typically follows seasonal trends, and the department can use that knowledge to help advise personnel on air quality.

“We can put out communications to our personnel directly on when air pollution is expected to be bad: We want you to be prepared ahead of time, make sure you have your room air cleaners with new filters, make sure that you have access to timely air quality data so that you can plan out your outdoor activities,” she said.

In most locations, Patel said air pollution is tied to seasonal trends, such as weather or local practices like agricultural burning.

“We understand in many places what the cycle of air pollution will be, what months we’ll have really high air pollution, what months will be low air pollution,” Patel said.

Patel said the department started its air monitoring program in Beijing around the 2008 Summer Olympics. The program soon spread to other posts in China, then continued into facilities in India.

From there, Patel said, there was a department-wide effort to offer air monitor installation at overseas posts. The department currently has about 80 posts with their own air monitors.

DLA preparing for data analytics to become ‘weapon system of the future’

The Defense Logistics Agency is working to grow an enterprise data management division within the office of its chief data and analytics officer, Lindsey Saul. She said DLA is doing this to try to drive its data maturity level up from two to three.

“In order to do that we’re focusing on the crown jewel of our strategic plan, which is this idea of VAULTIS, which stands for visible, accessible, understandable, linked, trustworthy, interoperable and secure,” Saul said during a June 16 Digital Government Institute webinar. “It’s a fancy name that the DoD came up with that really captures the key elements for what we strive for our data to be. There are a number of drivers, like I said; the Joint Staff and [Office of the Secretary of Defense] are trying to bring us along in this logistics operation space to make sure that our data is ready to go. And that we are, as it states in our DLA strategy, ready at a moment’s notice to make data-driven business decisions.”

Those efforts have come with a lot of questions, largely variations on “why now?” Leaders frequently ask Saul why data management suddenly requires more resources. Her response is that data has largely been managed in silos for the past few decades as it gets migrated from system to system, but modern data strategies and use cases require a new approach.

“In today’s day and age, where data is deemed a weapon — or they even say a data analytics platform is almost like a weapon system of the future for the Department of Defense — we really need to be prepared and equipped to have access to our data very quickly, be able to understand our data, and not do what we’ve done in the past, which is really rely on a single person who has all of the knowledge,” Saul said. “And instead, we need to make sure that it’s well documented, we have the right data dictionaries in place, the correct metadata and metadata management tools to link the lineage of the data to the present day.”

That makes cross-functional applications and collaborations easier. For example, Saul said, when the Ukraine crisis began, DLA was able to access the data and build a number of products within just a few days. That matters when resources can reach people whose lives are on the line faster. Similarly, DLA has been able to create applications in conjunction with the Department of Health and Human Services as part of the COVID-19 response.

DLA is also working on an initiative called Information Visualization Evaluation, using DoD’s Advana analytics platform.

“We really have the opportunity to see how that platform works, and leverage the data that we’ve been sharing over the past six months or so with OSD and Advana. So we’ve shared over 30 data sources, [and] made a number of system connections to Advana,” she said. “DLA Information Visualization Evaluation was really an opportunity to see how we could enhance certain products that were already there, that the Joint Staff J4 had already been producing for a common operating picture, with class three energy or fuels as well as class one operational rations. And we also took it a step further with class one to have one of our developers go into the platform and utilize their developing skills to build out a new application that provided insight into the positioning and whereabouts of our items as far as operational rations go. So that was really a big step for us.”

After a long-term study, evidence-based decisions need trustworthy data

In the push for evidence-based decision making, data-driven studies are scrutinized more and more at federal agencies. Long-term studies that rely on multiple stakeholders are vulnerable to environmental changes, technological barriers and personnel cooperation — all of which affect the data.

Trust underpins evidence-based innovation, but for Teri Caswell, a broadband program specialist at the National Telecommunications and Information Administration, all parties still need to agree on their definition of trustworthiness.

“Is the evidence that’s been provided or sought after trusted because it’s been tried and true? Is it trusted because of the community and the audiences of the people who are touching it or defining it, or presenting it?” she said as part of the Performance Institute’s 2022 Government Performance Summit on Wednesday. “I personally believe that if I am part of a compilation of evidence or an artifact (and there’s a delineation there as well), if I can present it once and then, when asked, present it again in a different labeling, packaging, compilation — however you want to phrase it — it still has to be trusted.”

In other words, context matters.

Trust is also important to Shonda Mace, a project manager in the Texas General Land Office’s Community Development and Revitalization program who has experience working with FEMA and the Department of Housing and Urban Development on long-term disaster recovery. Since Hurricane Harvey in 2017, her office has been conducting regionalized flood studies, which she said were purposely regionalized because local communities often do not communicate with each other, sometimes because of a lack of trust. Her team has to be the reliable go-between.

“So one big thing we’re doing is we’re working with not just the communities, but also other state agencies, and federal agencies to break down silos and work together,” Mace said. “If you don’t have the trust amongst the other agencies and your partners, if you don’t have the trust in most communities, you’re not going to get the information you need to move this project forward.”

With long-term studies, it can be difficult to keep all stakeholders engaged over time. Mace said communities want fast answers, and after a natural disaster has passed, the energy for impact studies can fade. She said it takes a balance: not exhausting stakeholders with outreach, but also not waiting so long between outreach that they forget about the study altogether. And while multiple agencies in Texas are conducting studies similar to her team’s, they must be careful not to duplicate efforts funded by federal dollars or else relinquish that money.

Caswell added that there should be client knowledge management in the background of long-term studies. Worldly and environmental considerations can change over the course of a study, such as budget cuts, political shifts or another entity assuming the program area.

“The positive side of that is, the more willing we are to look at evidence-based criteria to drive innovation, we should be seeking more than one or even 100 inputs to that innovation design, lest we get a reputation for doing things in a vacuum because we didn’t consider 80% of our benefactors,” she said.

Her recommendation was to track the key words and phrases that change over the course of multi-year studies, a vocabulary list or checklist of sorts, to maintain some level of consistency in the data so that questions are adequately answered by the end.

She also spoke to the question of whether to share information as you go, as opposed to waiting until the end of a study to show stakeholders the data. The anecdotal knowledge that subject-matter experts or senior advisors add can strengthen a report headed to an elected official, but it may be hard to digitize. In this case, Caswell said, it helps to take an iterative approach to informing stakeholders and reviewing the study with them before completion.

“I am done with the days of writing a summary report before we’ve even looked at the data. Let’s get it on paper, let’s get it in a report, get people around our camera or table, whatever it takes, and start recognizing what it does look like and are we on the correct path? And if we’re not, there’s your first [knowledge management] piece, right?” she said. “We went this way to prove or disprove the hypothesis, we’ve got a course correction we need to effect, we’re notifying all parties, we’re having a conversation, and we’re building the trust in the process, not just the report.”

However, she said, a major consideration that complicates data collection from stakeholders is the technological requirements involved when partnering with federal agencies. Normalizing technology at the federal level takes a long time, and as cybersecurity requirements increase, so do the possible challenges for stakeholders submitting data for studies. Mace’s office encountered this on its contract with the Army Corps of Engineers to review modeling: Sharing large items via Box or SharePoint was no longer an option, as USACE’s file sharing system was not large enough.

Yet the Texas General Land Office saw this as an opportunity to innovate. Mace said the GLO is working on a new Texas Disaster Information System with the University of Texas and Texas A&M University, “where our vendors can put models into there and USACE can go in and access them and get those models.”

Protecting encrypted data in a quantum computing world

In today’s globally connected world, encryption keeps you, your finances, your personal “secrets,” and your information safe.

Encryption is used everywhere today: social media, shopping, banking, email and more. Many other essential applications rely upon cryptography, or the practice of turning plaintext information into a scrambled ciphertext to keep private data secret and secure. When a bad actor successfully decrypts data, your information can be used to steal your identity, send emails pretending to be you, transfer your money, or worse. Here is the unwelcome news: Technology experts and security leaders believe this type of crime will be regularly committed by the end of the decade.

With today’s computers, sometimes called “classical computers,” it would take approximately 300 trillion years to break an RSA-2048-bit encryption key. A new type of computer called a “quantum computer” will perform the same calculation in 10 seconds. Because of the development of this new class of computers, keeping our information, financial details and personal secrets protected is at risk today.

On the plus side, quantum computers also offer many new and exciting possibilities. They will be able to help identify new pharmaceuticals, enable banks to provide better financial returns, and reduce the time and energy needed to manufacture and ship goods.

So how did we end up in this precarious state? Today’s economy and habits have resulted in all of our sensitive personal data being transferred on the internet and stored in vast encrypted repositories. Social media, email, cloud storage and health records encourage us to trust businesses with the secure storage and transmission of our information.

Most businesses believe their approach to encryption keeps your information out of unauthorized hands. Overall, today this works well; encrypted webpages (shown with a padlock in the address bar) are secure and have become default under a scheme known as “HTTPS Everywhere.” Encrypted web traffic now accounts for approximately 89% of all traffic traveling over the public internet. This includes banking and shopping, but also web pages where you are less likely to have sensitive information, such as those sites for local restaurants.

To understand how encryption has come to be “everywhere” today, we must know why it is used and how it has been used historically to protect sensitive data.

As with many inventions, the first types of encryption were a product of war. The ancient Spartans wrote a message with a certain periodicity that, when wrapped around a rod of the correct dimensions, would correctly space out the intended message. Any rod of the wrong size would cause incorrect spacing, scrambling the message into illegibility. Julius Caesar used the first ciphertext via the “Caesar Cipher,” which simply shifted characters by three positions: A goes to D, B becomes E, etc. Naturally, this was not an exceedingly difficult code to crack, so increasingly sophisticated ciphers have been developed to secure the sensitive contents of the underlying message.
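
For illustration, here is that three-position shift in a few lines of Python; shifting back by the same amount decrypts, which is exactly why the scheme was so easy to crack.

```python
# The Caesar cipher described above: shift each letter three positions
# forward (A -> D, B -> E), wrapping around at the end of the alphabet.
def caesar(text: str, shift: int = 3) -> str:
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)  # leave spaces and punctuation untouched
    return "".join(out)

print(caesar("ATTACK AT DAWN"))        # DWWDFN DW GDZQ
print(caesar("DWWDFN DW GDZQ", -3))    # shifting back decrypts: ATTACK AT DAWN
```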

IBM pioneered modern digital cryptography in the early 1970s. Known as the Data Encryption Standard, or DES, it became the U.S. national standard for encryption. DES remained in use until it was cracked in the late 1990s and was replaced by the Advanced Encryption Standard (AES), which is still in use today. Another equally significant event came in the 1970s, when Whitfield Diffie and Martin Hellman published their seminal work on key exchange. Soon, the RSA cryptosystem built on that work to offer secure key exchange as an algorithm. Today, AES combined with RSA allows for the secure transfer of information on the public internet: public key encryption is used to send the shared symmetric key privately between two parties.

Using both algorithms helps overcome each one’s weaknesses: public key encryption is slow whereas AES is quick, and AES requires both parties to know the same key to decrypt data, while public key encryption lets that shared secret be established securely between both parties. This is overwhelmingly today’s most popular encryption approach, protecting the world’s vast amounts of private data.
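
A minimal sketch of that hybrid pattern, using the open-source pyca/cryptography package: the bulk data is encrypted with a fast, random AES key, and only that small key is wrapped with slow RSA public key encryption. This illustrates the general scheme, not any specific product’s implementation; real protocols such as TLS add handshakes, authentication and key rotation on top.

```python
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Receiver side: generate an RSA key pair; the public key is shared openly.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Sender side: encrypt the bulk data with a fast, random AES-256 key...
aes_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)  # AES-GCM needs a unique nonce per message
ciphertext = AESGCM(aes_key).encrypt(nonce, b"the actual message", None)

# ...then wrap only the small AES key with slow RSA public key encryption.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = public_key.encrypt(aes_key, oaep)

# Receiver side: unwrap the AES key, then decrypt the bulk data quickly.
recovered_key = private_key.decrypt(wrapped_key, oaep)
plaintext = AESGCM(recovered_key).decrypt(nonce, ciphertext, None)
assert plaintext == b"the actual message"
```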

Public Key Infrastructure (PKI) has stood the test of time, protecting data for many decades, and will continue to do so for many decades to come. It has not been static either, as the PKI algorithms have been adjusted to increase security as computers have become more powerful and are more likely to crack less sophisticated encryption.

There is a massive upgrade to PKI underway that will cause almost every internet-connected device to update and change its underlying algorithms to a more secure encryption scheme. The governing body for encryption standards, the National Institute of Standards and Technology, along with the National Security Agency, Congress and the White House, have all demanded action on accomplishing this upgrade as soon as possible.

Why now? The answer is quantum computing. Scientists anticipate that quantum computing will compromise the Public Key cryptosystem used by PKI for the initial key exchange in a few years. Even Alphabet’s (Google’s parent company) CEO Sundar Pichai has stated that “[I]n a five to ten-year time frame, quantum computing will break encryption as we know it.”

The fallout from PKI encryption being broken would be massive. A recent study by the Hudson Institute has estimated that a successful quantum attack could cause up to $2 trillion in damage to the U.S. economy alone. Collateral damage would certainly expand beyond these vast numbers. Government, financial institutions, healthcare and critical infrastructure represent only a tiny fraction of sectors at risk.

This upgrade is not a simple process. The need to protect against quantum computer risks was not anticipated, so there is not an easy way to upgrade a device that is using current PKI. Systems will have to be patched either by appropriately skilled internal corporate networking professionals for on-premises deployments or by the hosting company for those in the cloud.

As with many new technologies, there are multiple solution designs and no standard for how a given device talks to other devices on the network and the internet; a one-size-fits-all option will not work for devices. It may not be feasible to upgrade all devices on a network simultaneously; a one-size-fits-all option for networks also will not work.

Companies need to start planning for this near-horizon risk today. Fortunately, there is a burgeoning post-quantum encryption market where companies recognize these challenges and are rising to address them. Any solution in the post-quantum networking space must be able to overcome both the short-sighted approach to current network security that made this upgrade so abrupt and the highly variable device and network needs of each organization and business sector. Software-only update approaches allow for flexible, extensible management of this upgrade to meet the needs of today’s diverse organizations.

Dave Krauthamer is CEO of QuSecure.

Leveraging the DoD Data Strategy in 2022

In the year since the Department of Defense Data Strategy was created, DoD has continued to make strides toward its goal of transforming into a data-centric agency. The strategy demonstrates organizational awareness of data’s role as a strategic asset for keeping the warfighter safe at home and abroad.

The implementation of a structured strategy helps enable the DoD to maximize this important asset, allowing it to incorporate artificial intelligence to improve data management processes and better inform everyone from the leadership level to the tactical warfighter. Advancements like automation improve how that data is protected and recovered in a disaster.

As the DoD becomes more data-centric, the strategy’s goals continue to pave a path for success, even a year later. These goals include recruiting and retaining information technology professionals, ensuring the interoperability of data and properly backing up data.

Addressing talent challenges

Recruiting, training and retaining information technology and data professionals is one of four essential capabilities named to help DoD reach the goals of its strategy. With strategic goals like making data visible, accessible and secure, DoD will proactively minimize data-sharing challenges. However, like many government agencies, the DoD faces a shortage of IT professionals in AI, machine learning and data science. Recommendations from the Government Accountability Office, including creating a digital service academy to help train the next generation of DoD IT professionals, can help address those talent challenges.

Top IT talent is critical to the overall success of not only the DoD Data Strategy, but DoD overall. The DoD’s mission is fundamentally the protection of the American people. Service within the DoD is therefore a high public calling, an opportunity to be a part of something greater than one’s self. This core sense of service and higher purpose must be emphasized, fostered and inculcated in hiring and retention efforts.

For top tech talent, service within DoD presents the opportunity to be on the cutting edge of technological development as the DoD has also been and continues to be a leading technological innovator. It was the U.S. government that harnessed the power of the atom, that put man on the moon and invented the internet.

Streamlined and expedited hiring authorities, the opportunities for accelerated advancement and promotion for deserving personnel, separate pay scales for the IT workforce, accelerated student loan repayment programs, expense-free, advanced schooling opportunities at leading universities and job placement within cutting edge innovation projects are some of the ways that government can incentivize the recruitment and retention of IT talent.

Transformation to improve recruitment and retention efforts is important for the DoD to get closer to reaching its data modernization goals, as well as furthering its ability to increase data accessibility awareness among warfighters.

Ensuring interoperability, accessibility of data

The interoperability and accessibility of mission-critical data is key in ensuring information can be used strategically and tactically. Once met, these goals will allow data to reach those who need it reliably and automatically.

Data needs to be accessible from a wide array of sources, including weapons systems and military vehicles like fighter jets. When that data is accessed, it must be secure against loss due to user error or disasters, regardless of the platform it’s stored on. This means that data protection and backup solutions — crucial to ensuring data is always available — must work across all cloud platforms, with Kubernetes and more.

A good, modern example of data interoperability is the Air Force’s use of containers (orchestrated by Kubernetes) in flight for the first time. During a reconnaissance-training mission on a Lockheed U-2 nicknamed “Dragon Lady,” data from the U-2’s legacy computer systems was combined with modern containerized applications. This allowed the aircraft to harness the power of four computers to run machine-learning algorithms and utilize real-time data. This type of data interoperability is exactly what the DoD strategy is designed to foster. However, with the increase of data and its operability comes the need for greater data protection and backup.

Considerations for data protection, backup

Making data secure is one of the most important goals defined by the strategy. DoD data must remain available regardless of cyber-attacks, natural disasters and human errors. As data is recognized and utilized as a strategic asset, maintaining its availability can truly become a matter of life or death. In the event of data loss, active military members and civilians must be able to quickly recover data with zero impact or delay.

Because data modernization is an evolving process, DoD’s data protection and backup plans need to be agile and adaptable. Incorporating best practices, including protecting physical, virtual and cloud data as well as training employees on data protection will help reduce loss and downtime of mission critical data.

As DoD’s data goals move forward, data protection needs will increase exponentially. The amount of staff focused on data, however, does not reflect this; few network administrators or other IT professionals are focused exclusively on data protection.

To supplement IT staff needs, DoD should implement automated disaster recovery applications. These applications allow a network to pre-define how it will fail over from one site to another, eliminating human error from the equation, and reducing downtime. This makes disaster recovery less complex and easier to accomplish, reducing the need for IT specialists.
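
As a purely hypothetical sketch of the idea, a pre-defined failover plan is just data: the recovery steps are written, reviewed and tested before any incident, then executed the same way every time instead of being improvised by an operator under pressure. All names below are invented for illustration; real disaster recovery products and their APIs differ.

```python
# Hypothetical pre-defined failover plan: each step is declared in advance
# so recovery runs the same way every time, with no mid-incident typing.
FAILOVER_PLAN = [
    ("pause replication",            "site-a", "site-b"),
    ("promote standby database",     "site-a", "site-b"),
    ("repoint DNS to recovery site", "site-a", "site-b"),
    ("verify application health",    None,     "site-b"),
]

def execute(plan) -> None:
    for step, src, dst in plan:
        # A real tool would call an orchestration API here and halt or
        # roll back on failure instead of printing.
        print(f"[failover] {step}: {src or '-'} -> {dst}")

execute(FAILOVER_PLAN)
```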

Another way to reduce complexity when it comes to data protection and backup is to lessen reliance on hardware-dependent platforms. Hardware-dependent data protection platforms do not allow users to unify as many workloads as possible under one control plane and are inconsistent with a software-defined strategy that many agencies are pursuing. Overall, the DoD should reduce the number of complex tools deployed within a network.

The DoD workforce often consists of generalists with a versatile set of IT skills and roles, and backup and data recovery are often a subset of several differing responsibilities. By eliminating complexity and adopting automated disaster recovery, network administrators can perform their roles better and reduce risk to mission continuity.

Modern warfare has made data an asset that adversaries target. The DoD Data Strategy has not only recognized the importance of protecting data and treating it as the strategic asset it is, but also demonstrates DoD’s commitment to sharing information across the defense community. As the defense community looks towards the new year, continuing to see the strategy through will help make the goal of becoming a data-centric agency a reality.

Earl G. Matthews is president of Veeam Government Solutions (VGS).

Pentagon’s CDAO aims to scale ‘different operating model’

The Pentagon’s new chief digital and artificial intelligence office is quickly bringing together multiple tech specialists under one roof in the latest bid to scale a “different operating model” for delivering digital technologies across the Defense Department.

The CDAO hit full operational capability on June 1 and is hosting an online “DoD Digital and AI Symposium” this week. The office merges the Joint Artificial Intelligence Center, the Defense Digital Service, the chief data officer and the Advancing Analytics (Advana) platform that originated in the DoD comptroller’s office.

Margie Palmieri, the deputy chief digital and artificial intelligence officer, compared it to a major merger and acquisition activity in industry.

“Companies go into mergers and acquisitions to be competitive,” she said on day one of the symposium. “And that’s exactly what the Department of Defense is doing. We are increasing our competitive advantage by bringing these different groups together. And for the first time in my career of over 15 years in government at this point, all the right levers of change and influence are coming into play in the CDAO.”

While recognizing that DoD’s industrial age acquisition approaches are still appropriate for the thousands of aircraft, ships and vehicles the Pentagon buys every year, Palmieri said software, digital technologies and data analytics require an alternative approach to scale across DoD. The various organizations coming together under the CDAO all have experience in piloting alternative approaches to the Pentagon’s traditional development and buying processes.

“This team coming together to show what a different operating model looks like is one of our top priorities,” Palmieri said.

Who’s on the CDAO team

In late April, the Pentagon announced Craig Martell would be the first chief digital and AI officer. Martell was most recently head of machine learning at Lyft, previously led machine learning at Dropbox, and he led a number of AI initiatives at LinkedIn. He also was a tenured computer science professor at the Naval Postgraduate School.

“I’m doing it because of the mission,” Martell said during the symposium. “It’s extremely important that we get this right. And there are not a lot of folks who have the intersection of AI and a government background. And when the deputy secretary of defense calls you up and says, ‘We would like you to take this job,’ you have to think really hard about why you wouldn’t take the job, and not the other way around. And I think getting this mission right is extremely important.”

As the No. 2 at the CDAO, Palmieri has more than a decade of DoD experience, most recently as special assistant to the vice chief of naval operations. She was also founding director of the Navy Digital Warfare Office, a relatively new organization focused on data analytics and Project Overmatch, the sea service’s contribution to the military’s Joint All Domain Command and Control concept.

The CDAO also has filled out its ranks with officials from the JAIC, DDS, the CDO’s office, Advana and other DoD organizations:

  • Clark Cully, previously DoD’s deputy chief data officer, is now the deputy CDAO for policy, strategy and governance.
  • Sharothi Pikar is the deputy CDAO for acquisition and AI assurance. She joined the JAIC in the spring to lead acquisitions. She also previously held positions as acquisition executive at U.S. Cyber Command and as the associate director for cyber strategies in the office of the under secretary for research and engineering.
  • William Streilein, who had a long career at MIT Lincoln Labs before joining the JAIC as chief technology officer this spring, is now CTO for the CDAO.
  • Joe Larson, currently chief of Project Maven, is leaving the project to be the deputy CDAO for algorithmic warfare. Larson helped co-found Project Maven, the Pentagon’s AI pathfinder program, which is now transitioning to the National Geospatial-Intelligence Agency.
  • Greg Little is the deputy CDAO for enterprise capabilities. He was previously deputy comptroller for enterprise data and business performance. Little was also program lead for Advana, a big data program that now falls under the purview of the CDAO.
  • Katie Olsen, who was the deputy director of the Defense Digital Service, is now the deputy CDAO for digital services.

The DDS in particular excelled at bringing tech talent from outside the government into DoD for temporary tours of duty to work on high-profile problems. Recently, DDS led Project Rabbit, an effort to allow employers to verify their previous or current Afghan employees seeking asylum in the United States.

Olsen said she thinks the CDAO can help scale the DDS model.

“We’ve been a ragtag team of 50 to 60 people, give or take, for the past seven years,” Olsen said during the symposium. “I’m excited about being part of the CDAO because I think it’s an opportunity to replicate and scale that talent and the [idea that] when you give smart people agency to do things, what can happen.”

CDAO operating model

Palmieri said the groundwork laid by CDAO’s predecessor organizations has helped establish good practices for how to approach digital and AI technologies within DoD.

“The good news is everything that the department has to do to scale digital analytics and AI is relatively known at this point,” she said.

In particular, she said determining the right “feedback loop” between users and developers will be key as CDAO looks to tailor the budget, requirements and acquisition processes to its technology goals.

“We really see the opportunity for the CDAO to put together a different operating model for how do you actually deliver these types of capabilities in a meaningful way to users and create a more rapid feedback loop where requirements, acquisition and funding are all in response to that capability need, instead of driving the pace, as happens today.”

Former Google chief executive Eric Schmidt told Martell he should avoid a wholesale reliance on prime contracting. Schmidt has been highly influential in driving DoD’s technology organizations and strategies in recent years as chairman of the Defense Innovation Board and then co-chairman of the National Security Commission on AI.

“My guess is that at the end of the day, your success will be your own software people in the government, as well as essentially small firms where you’re essentially contracting with a firm, but it’s really one person. And then you’ll have some priming. If you start with a full private approach, you’ll never get there.”

Martell said it will likely be a combination of in-house coders and contracted work, suggesting a collaborative approach.

“We’re not going to grow the talent fast enough to have all of the coders in government to do what we need to do,” he said. “We’re not going to replicate these agile new AI companies. But if we have the authority to say, ‘You’re sitting with us, and you’re our agile team, and you’re going through the loop with us as we’re building it,’ I think we have a greater chance of success.”

]]>
https://federalnewsnetwork.com/defense-main/2022/06/pentagons-cdao-aims-to-scale-different-operating-model/feed/ 0
VA health data gaps after EHR rollout put hospital accreditation at risk, IG warns https://federalnewsnetwork.com/veterans-affairs/2022/06/va-health-data-gaps-after-ehr-rollout-put-hospital-accreditation-at-risk-ig-warns/ https://federalnewsnetwork.com/veterans-affairs/2022/06/va-health-data-gaps-after-ehr-rollout-put-hospital-accreditation-at-risk-ig-warns/#respond Wed, 01 Jun 2022 20:20:48 +0000 https://federalnewsnetwork.com/?p=4084591 The Department of Veterans Affairs’ first medical center to launch its new Electronic Health Record (EHR) is running into data quality challenges so severe that the agency’s inspector general’s office is concerned about whether the facility can maintain its hospital accreditation.

The IG’s office, in a report Wednesday, found the Mann-Grandstaff Medical Center in Spokane, Washington still lacked critical quality and patient safety metrics a year after the EHR go-live.

The new EHR went live at Mann-Grandstaff VAMC in October 2020. The facility served over 35,000 patients in fiscal 2021.

The IG’s office found that the facility staff’s lack of access to critical metrics hurt the facility’s continuous readiness for re-accreditation, “which may compromise the facility’s future hospital accreditation status.”

“The OIG remains concerned that deficits in the new EHR metrics may negatively affect organizational performance, quality and patient safety, and access to care,” the report states.

The report states that the facility losing its accreditation would hurt patients’ trust in the facility and make it harder to recruit quality employees.

House lawmakers have raised concerns about the new EHR’s reliability, and have called on the VA to pause future rollouts of the system until it conducts a comprehensive review of problems.

The Senate last week passed the Electronic Health Record Transparency Act, which requires the VA to submit periodic reports to Congress on the costs, performance metrics, and outcomes for EHR modernization. The bill now heads to President Joe Biden’s desk.

VA officials told the Spokesman-Review last month that the EHR has experienced 42 “unplanned degradations” and eight “unplanned outages” between its launch in October 2020 and April 20, 2022.

The IG’s office released three reports in March that found the new EHR sometimes failed to indicate to providers that patients were flagged as being at high risk of suicide and gave VA providers an incomplete picture of patients’ health care data.

VA Secretary Denis McDonough has repeatedly told lawmakers and reporters that he’s concerned by issues in the EHR rollout to date, but said the VA will proceed with EHR go-lives at additional facilities.

The VA announced in July 2021 that it would pause future rollouts after a strategic review found a wide array of problems at the first go-live site in Spokane.

The EHR went live at a VA facility in Walla Walla, Washington this March, and in Columbus, Ohio in April.

Deputy VA Secretary Donald Remy agreed with the report’s recommendations but told the IG’s office the VA anticipated that it would take time to fully utilize its new EHR’s analytics capabilities.

Remy added that the data between the agency’s legacy VistA EHR and the new Cerner EHR “would not be directly comparable,” particularly for appointment scheduling.

“The transition is very difficult, but the Cerner system will benefit VA by providing better standardization, more real-time front-line analytics, a common system with DoD and other health systems and alignment with health care industry best practices,” Remy wrote.

The Cerner EHR is running at more than half of the Defense Department and Coast Guard’s health care facilities, as well as 27,000 private provider facilities and more than 5,900 hospitals globally.

Remy said the transition underscored the challenge of implementing a new EHR in the largest integrated health care system in the U.S. The VA’s enterprise data encompasses over 2 trillion rows of data from 13 source systems.

However, the IG’s office remains concerned that EHR problems found in Spokane may create additional problems as the EHR goes live at larger, more complex VA facilities.

“The OIG is concerned that further deployment of the new EHR in VHA without addressing the gap in metrics available to the facility will affect the facility and future sites’ ability to utilize metrics effectively,” the report states.

‘Every week those workflows are changing’

The IG’s office found the Spokane facility, one year after the EHR go-live, had only six metrics partially available, out of 17 metrics necessary for accreditation.

A VHA leader told the IG’s office that the facility was “absolutely not” ready for an upcoming accreditation survey through The Joint Commission, an independent, not-for-profit organization that certifies nearly 21,000 health care organizations and programs in the U.S.

“Every week those workflows are changing, meaning the way they do work, what they enter is changing every week. It’s hard to keep up,” the leader said.

The IG’s office raised concerns that missing clinical metrics, including patient safety and quality of care metrics, “may not allow for accurate and timely patient safety monitoring,” and could delay opportunities to improve service.

The report also found that a lack of publicly reported quality metrics made it harder for veterans to make informed choices about VA care and how the quality of VA care compares to private sector providers.

VA is mandated under the MISSION Act to measure, track, and publish quality and patient safety metrics at VA facilities. VA must ensure that data reported to the public are “clear, useful, and timely” so that patients are able to make “informed decisions regarding their health care.”

‘Many hours have been added to workload’

The IG’s office found that following the EHR go-live, facility staff created workarounds to mitigate post-go-live gaps in metrics.

“By having to audit every patient admitted during a time frame to see if they are applicable to my data needs, many hours have been added to workload,” one facility employee told the IG’s office.

Facility staff told the IG’s office that the workarounds created a “tremendous” increase in additional workload, at times requiring several hours or days just to prepare one metrics report.

“At times I have worked weekends and nights until 10 p.m., 12 midnight, or one time I didn’t even go to sleep to provide a needed report by the next morning for our leadership or for a suspense [due date],” an employee told the IG’s office.

Following the EHR go-live, one employee said responses to data requests took about eight people a combined 24 hours a week to complete.

“Now we have it down to about 6-8 a week,” the employee told auditors.

Staff also relied on workarounds to assess wait times for new and returning patients seeking care. A facility leader told the IG’s office that those workarounds produced approximate results, but were not “the exact metrics required by VA.”
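The report does not spell out the VA’s exact metric definitions, but the approximate wait-time tallies staff described can be sketched in a few lines of Python’s pandas library. The column names and dates below are invented for illustration:

    import pandas as pd

    # Invented scheduling export; these column names are illustrative, not the VA's schema.
    appts = pd.DataFrame({
        "patient_type": ["new", "established", "new", "established"],
        "request_date": pd.to_datetime(["2022-01-03", "2022-01-04", "2022-01-10", "2022-01-11"]),
        "appointment_date": pd.to_datetime(["2022-02-01", "2022-01-14", "2022-02-20", "2022-01-21"]),
    })

    # Days waited, averaged by patient type -- an approximation, not "the exact metrics required by VA."
    appts["wait_days"] = (appts["appointment_date"] - appts["request_date"]).dt.days
    print(appts.groupby("patient_type")["wait_days"].mean())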

The IG notes the VHA’s “history of deficient scheduling” as a major reason why the agency needs accurate metrics on patient wait times.

The IG’s office found in 2014 that 1,700 veterans in need of care were “at risk of being lost or forgotten” after being kept off the official waiting list at a VA hospital in Phoenix, according to the Associated Press. The resulting controversy led to the resignation of VA Secretary Eric Shinseki.

The IG’s office found that the inability of VHA staff in Spokane to track the availability of care and wait times “impedes the ability to prevent delays in care and could lead to patient harm at the facility.”

The report found these metric gaps made it impossible for veterans to compare wait times at other VA facilities or choose care at facilities with shorter average wait times.

A facility leader explained that the “inability to measure what we do accurately has a negative outcome to metrics, but not necessarily to care.”

A Veterans Integrated Services Network (VISN) leader told the IG’s office that any data or reports that required manual validation increased the risk of human error.

“Because we have been very diligent about both creating the reports gradually over time and having our chiefs provide their best estimates, I don’t believe this has been a direct patient safety issue. But it clearly is an efficiency issue and ultimately accurate data is needed to make [the] best decision,” the VISN leader said.

‘Everything is different’

VHA employees told the IG’s office that these workarounds stemmed from the new EHR producing a much larger volume of data, but labeled in ways unfamiliar to staff.

A VHA leader told the IG’s office that 10 days after the new EHR go-live, the agency received “roughly eight times the amount of data out of Cerner than we have ever gotten out of VistA,” the VA’s legacy EHR system.

The VHA leader said the new EHR produced data fields that were named differently than what employees were used to under the old system, and “we have never seen the data before.”

“Everything is different,” the leader explained.

A staff member told the IG’s office that facilities using the legacy EHR have one dashboard. However, the employee told the auditors that the new EHR requires facility staff to pull data from five different reports, export the data to a software program, manipulate the data and then review the data for errors.
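A minimal sketch of the consolidation workflow that staff member describes (five separate exports merged into one reviewable file) might look like the following; the file names and the shared “encounter_id” key are assumptions, not the EHR’s actual schema:

    import pandas as pd

    # Hypothetical exports standing in for the five separate reports staff described.
    report_files = ["report1.csv", "report2.csv", "report3.csv", "report4.csv", "report5.csv"]

    # Merge every export on a shared record key (the key name is an assumption).
    merged = None
    for path in report_files:
        df = pd.read_csv(path)
        merged = df if merged is None else merged.merge(df, on="encounter_id", how="outer")

    # One consolidated file to review, in place of the legacy system's single dashboard.
    merged.to_csv("combined_metrics.csv", index=False)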

The report finds that a lack of data definitions in the new EHR reports made it difficult for facility staff to explain what metrics meant to leadership, and made it harder for staff to export and use data.

“The problem with exporting any data is that we do not have data definitions to determine where the data comes from or what it really means. By making assumptions, we could easily make a huge error,” a facility leader told the IG office.

Another facility leader told the IG’s office data from the new EHR was frequently unusable, unless data were exported from the new EHR and then manipulated with other software tools.

Exporting data created additional challenges. The facility leader explained that following the export of data, the data had to be reviewed for strange outliers, such as duplications, test data and data in the wrong columns.

“This happens regularly, so it can never be trusted. If you forget even one filter or service from the manual clicking of each clinic, you must start over,” the facility leader said.
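The outlier review that leader describes (scanning for duplications, test data and values in the wrong columns) is the kind of pass a short validation script can automate. All file and column names in this sketch are assumptions carried over from the earlier example:

    import pandas as pd

    df = pd.read_csv("combined_metrics.csv")

    # Duplications: rows that repeat the same record key.
    dupes = df[df.duplicated(subset="encounter_id", keep=False)]

    # Test data: records matching an assumed test-account naming pattern.
    test_rows = df[df["patient_name"].str.contains("TEST", case=False, na=False)]

    # Data in the wrong columns: non-numeric values sitting in a numeric field.
    bad_numeric = df[pd.to_numeric(df["wait_days"], errors="coerce").isna() & df["wait_days"].notna()]

    print(f"{len(dupes)} duplicates, {len(test_rows)} test rows, {len(bad_numeric)} misfiled values")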

]]>
https://federalnewsnetwork.com/veterans-affairs/2022/06/va-health-data-gaps-after-ehr-rollout-put-hospital-accreditation-at-risk-ig-warns/feed/ 0
An NIH technology executive moves from the health field to banking and finance https://federalnewsnetwork.com/technology-main/2022/05/an-nih-technology-executive-moves-from-the-health-field-to-banking-and-finance/ https://federalnewsnetwork.com/technology-main/2022/05/an-nih-technology-executive-moves-from-the-health-field-to-banking-and-finance/#respond Fri, 27 May 2022 19:19:51 +0000 https://federalnewsnetwork.com/?p=4079225

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

After a long stint as the deputy director of the center for information technology at the NIH, Stacie Alboum is about to head to the Federal Deposit Insurance Corporation. At the ACT-IAC emerging technology conference in Cambridge, Maryland earlier this week, Federal Drive with Tom Temin caught up with Alboum to discuss what her new job will be all about.

Interview transcript:

Stacie Alboum: I’ll be joining the CIO organization as the Deputy Director for enterprise strategy.

Tom Temin: What does that mean, enterprise strategy? Enterprise IT strategy? Or what should the FDIC do for a living?

Stacie Alboum: So the FDIC has quite an impressive IT operation over there. And I’m going over there, one of the big things I’ll be doing is leading the development of the next IT strategic plan. And working with the IT leadership team there under Sylvia Burns’ leadership, of course, to modernize the IT infrastructure and architecture. And what I’m most excited about, the role that I’ll get to play in this new position is engaging and partnering with the programs across the agency. So getting to learn about their business, and more about the mission, and ensuring that IT is being responsive to their needs and fulfilling them. And obviously, I’ve been very fortunate to support so many incredible missions across the federal government over the last two decades. This is my fifth agency, and I’m going from spending the last 15 years supporting science missions. So protecting people’s health, to going to more of the financial industry. But personally, maintaining the stability and the public confidence in the nation’s financial system is a pretty important job. And I’m just really honored to join the FDIC family and be a part of it.

Tom Temin: And you mentioned unifying the missions there through the network. And you did that at NIH, to some extent, all of the different components of NIH, I think there’s what a couple of dozen different components within NIH, the different Institutes and there was a construction, a deliberate way of tying them together with a single network. Maybe review that for us, where does that all stand now? Is it done? Or is it still evolving?

Stacie Alboum: The network is one of the most critical sets of capabilities that we’ve been delivering at NIH’s center for IT. As you’ve heard, we’re so dependent on that with the massive amounts of data that we generate. And so when I joined NIH in 2017, we were in the midst of modernizing our network. And so we completed that around, let’s say, 2018, 2019, and stabilized. And believe it or not, we’re now embarking on the next generation of our network capabilities. And so one of the things I love about NIH is it’s really a community. So although the way it’s organized and operates is highly federated, we really come together as a community. So we’re currently partnering with the other institutes and centers and with industry to formulate our vision for the future of the network.

Tom Temin: And what did you learn from that effort that might carry you into FDIC?

Stacie Alboum: Well, something that’s quite pertinent and will play into my role, which is the importance of understanding the business, bridging technology to the business, spending time establishing those partnerships with people all over the agency, and really collaborating with them and co-creating solutions with them. Embedding IT into the business. And so I learned a lot about that at NIH. And I’m excited to do the same at FDIC.

Tom Temin: And the purpose of a network, of course, is to move data.

Stacie Alboum: Six petabytes at our agency.

Tom Temin: Right. And that’s something that couldn’t have been imagined 20 years ago, or 40 years ago, when this whole IT thing started in the government. So what are your data learnings? What do people need to know about data in this age, when you move six petabytes a day?

Stacie Alboum: Well, it’s interesting, because we’ve always done a lot in the labs and what our network has really done is interconnected all of those facilities and given us the ability to share data and store and manage it in ways we never could before. And the network is certainly at the core of that. But what we’ve been doing recently through our STRIDES program, is moving large research datasets out to our cloud platforms. And I don’t know that we could have imagined that even five years ago. And so I think the biggest learning when it comes to data is just that the landscape is always shifting; data’s always going to be at the heart of what we do and all of our programs, but the world around us is changing and we have to constantly be, you know, kind of watching where the world is going and shifting with it and not be afraid to take the next leap.

Tom Temin: And you have been kind of a pioneer in, or at least an advancer of, the idea of what you call mission excursions. Tell us what those were at NIH, and is it something you plan to do at FDIC?

Stacie Alboum: Yeah. And I imagine Sylvia has probably already got a head start on that at FDIC. So I’m excited to go there. She’s already been leading the IT organization for several years. But when I came to NIH, you know, we have an incredible mission, and really talented, dedicated workforce, I call those things together, kind of the secret sauce of any incredible work experience. But what I noticed is that we have a very diverse IT workforce, and some of our newer employees hadn’t had a chance to really engage with the broader NIH community, and learn about the broader mission. At the same time, we had many talented individuals that had worked at NIH for years, decades even, and maybe had at some point become, you know, disengaged or more removed from that. So as part of our program, most people would call it employee engagement, I call it improving the employee experience. Because our employees are engaged. And it’s our responsibility as leaders to improve their experience. So a cornerstone of this program was mission excursions, where we organized groups of staff to go out into the labs, into the programs, instead of them having to come to us, and understand what a difference it makes day in, day out at NIH.

Tom Temin: And you have some really good examples that seared that idea into their brains, such as the fact that a lack of good Wi-Fi coverage, yes, can affect the outcome of a clinical trial.

Stacie Alboum: Exactly, or even cellular. Before we modernized the network, which included our wireless and our cellular coverage, we were lacking in those capabilities. And one of the most special places at NIH is our clinical center where, you know, people come where they have nowhere else to go. It’s like our former director, Dr. Collins, says, you know, he thinks of us as the national institute of hope, institutes of hope. And so that was one place where I partnered with some of my colleagues at the clinical center. And they were gracious enough to take us on tours where we could see firsthand the patient care that was happening in the clinics that we run.

Tom Temin: Right, and there were people doing work-arounds as a result of not having Wi-Fi. And that drove home the idea that even something as mundane as Wi-Fi coverage matters.

Stacie Alboum: Exactly. Matters a lot.

Tom Temin: In some cases, those basics matter more than the next artificial intelligence project.

Stacie Alboum: And when you as a technologist see that firsthand, whether you’re an engineer, or an operator or developer, and stop and think for a moment, that could be my family member that’s cared for there. That could be my child, my brother, my sister, a neighbor, a friend, that’s, you know, come here for help and hope, and wouldn’t you want them to have all the possibilities that NIH has to offer?

Tom Temin: And finally, what’s the next frontier for customer experience as it relates to the federal employee?

Stacie Alboum: Yes. So customer experience is, of course, all the rage right now, which I think you well know, Tom. And one of the philosophies that I believe in is that customer experience really begins with employee experience. And you have to give your people, the ones that are getting the work done every day in your organizations, a positive experience that will, you know, really trickle over into the rest of the agency, the customers you serve, both internally and externally. And so we put a lot of focus in our organization, we have a set of operating principles to empower people, enable innovation and deliver value. A set of core values that we live and breathe by every day. And underpinning those are diversity, equity and inclusion, which I believe fuel innovation and breed excellence in any organization. And that was one of the things that really attracted me to FDIC is when I was preparing to pursue this opportunity, I read the FDIC strategic plan, the CIO organization’s IT strategic plan, and their core values very much resonated with me.

Tom Temin: Stacie Alboum is the new deputy director for enterprise strategy in the office of the CIO at the Federal Deposit Insurance Corporation.

]]>
https://federalnewsnetwork.com/technology-main/2022/05/an-nih-technology-executive-moves-from-the-health-field-to-banking-and-finance/feed/ 0
Spy agencies look to standardize use of open source intelligence https://federalnewsnetwork.com/intelligence-community/2022/05/spy-agencies-look-to-standardize-use-of-open-source-intelligence/ https://federalnewsnetwork.com/intelligence-community/2022/05/spy-agencies-look-to-standardize-use-of-open-source-intelligence/#respond Thu, 12 May 2022 20:26:31 +0000 https://federalnewsnetwork.com/?p=4055968 Intelligence agencies are starting to coalesce around a set of common standards and data for using open source intelligence, but challenges remain in boosting the use of OSINT throughout the intelligence community.

Patrice Tibbs, chief of community open source at the CIA, said open source has “proven itself over and over,” especially given current events like Russia’s invasion of Ukraine. OSINT is generally defined as unclassified information, often publicly available, like data gleaned from social media feeds.

“My five-year-old grandson understands the value of the iPhone and in communicating,” Tibbs said during a May 2 event hosted by the Intelligence and National Security Alliance. “If we can’t get on board and figure that kind of thing out now, and understand how that can be leveraged to make sure that we are clear in every country, every city, every home, in some cases, we will lose the lion’s share of any benefit we have in open source.”

Spy agencies have traditionally been organized around other forms of intelligence, like geospatial intelligence or signals intelligence. Agencies have struggled to define how OSINT fits into their broader tradecraft, but the array of public information about the Ukraine conflict has started to shift the conversation about OSINT in intelligence circles.

The 2022 Intelligence Authorization Act is also pushing agencies to build more OSINT capabilities within the context of competition with China.

“The Intelligence Community must reorient to engage in a strategic competition with the PRC while countering China’s malign activities globally,” the Senate’s report on the Intelligence Authorization Act states. “To do so, it must continue to build open source intelligence capabilities and augment capacity; enhance sharing of intelligence capabilities; and strengthen the analytical and collection capabilities relating to non-military threats including technology competition.”

As head of the CIA, Director Bill Burns is considered the “community functional manager for open source,” meaning he reports to Director of National Intelligence Avril Haines on OSINT policy, requirements and funding, according to Tibbs.

The intelligence community also has a “National Open Source Committee,” which includes senior leaders from each of the 18 intelligence agencies. Within the committee, there are subcommittees specifically focused on issues like OSINT data, collection management, training and tradecraft, according to Tibbs.

She said senior leaders are starting to take OSINT more seriously, providing a chance to set common standards around open source training and tradecraft.

“The key for me is just understanding how we modify and change and adapt to the amount of data that’s available,” Tibbs said. “And because there’s not a consistency of how all of the different 18 organizations are utilizing, are capturing or are integrating open source into their workflows, there is inconsistency sometimes in how that is translated and shared and a variety of other things.”

Tibbs said the most difficult challenge for OSINT is “getting everybody to agree on the direction.” She also said ensuring the intelligence community has the technology infrastructure to support OSINT is important.

“Also, it’s just having the individuals with the right skill set and the motivation to come in and really take on these challenging roles, especially in the federal government realm,” Tibbs added. “This is not always the easiest place to work. It is not always the highest paid place to work.”

DIA open source center

The Defense Intelligence Agency is also looking to elevate OSINT through its Open Source Intelligence Integration Center. It was established in late 2019, and governs the military’s use of OSINT by leaning on standards, processes and tradecraft, according to Brad Ahlskog, chief of the center.

Compared to traditional forms of intelligence, Ahlskog said OSINT is now playing an “outsized role” in scenarios like exposing Russia’s build-up of military forces and subsequent invasion of Ukraine.

“I would argue that along with the amplified information warfare aspect of hybrid warfare, there’s a vital need for what I’ll call ‘hybrid intelligence,’” Ahlskog said during an April 26 presentation at the GEOINT conference in Denver. “It features a larger reliance on OSINT to both identify the threats, support deterrence operations, and provide the ground truth of the battlefield situation during the conflict.”

But he said the Defense Department will rely on contractors to help with a “data-centric approach” to OSINT. The approach will rely on artificial intelligence and automation to speed up the exploitation of such data, but it doesn’t come without some caveats.

“This translates to a need for datasets that are cataloged, discoverable, or resident in government or commercial applications that are readily machine ingestible through application interfaces for transfer into other applications in formats that are not proprietary,” Ahlskog said. “Too often in the past, some open tools or applications were created with self-contained datasets that were not easily blended with other data, or datasets from other sources that cannot be easily moved into a proprietary application.”
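What “machine ingestible through application interfaces” in a non-proprietary format looks like in practice can be sketched briefly. The endpoint below is hypothetical; the point is the round trip of JSON pulled over a documented interface and written back out as plain CSV for the next tool:

    import csv
    import json
    from urllib.request import urlopen

    URL = "https://example.gov/api/v1/events?format=json"  # hypothetical open-data endpoint

    # Pull records over a documented interface in a non-proprietary format (JSON).
    with urlopen(URL) as resp:
        records = json.load(resp)  # assumes the API returns a list of flat objects

    # Hand the same data to the next application as plain CSV, no proprietary container.
    with open("events.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=sorted(records[0].keys()))
        writer.writeheader()
        writer.writerows(records)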

Furthermore, he said it will be crucial for analysts to understand how algorithms are analyzing the data and reaching specific conclusions.

“We want to produce actionable intelligence from them, and we must be able to easily understand and clearly explain how application ‘X’ produced information ‘Y’ that was used to inform decisions ‘Z,’” he said. “Without transparency into how OSINT data is obtained and processed, the intelligence community personnel will be very reluctant to rely on that PAI for intelligence purposes. Verification, validation and sourcing play integral roles for data in these cases.”
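One minimal way to support that traceability, where application ‘X’ produced information ‘Y’ that informed decision ‘Z,’ is to attach a provenance record to every derived product. This is a sketch of the idea, not any actual DIA system, and every name in it is invented:

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class ProvenanceRecord:
        """Lineage for a derived product: application X produced information Y."""
        application: str      # X: the tool or model that produced the output
        inputs: list[str]     # source datasets, so sourcing can be verified
        output: str           # Y: the information product handed to analysts
        method: str           # how X turned the inputs into Y
        produced_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

    # Invented example: a decision (Z) can now be traced back through this record.
    record = ProvenanceRecord(
        application="entity-resolver-v2",
        inputs=["social-feed-2022-04", "ship-ais-2022-04"],
        output="assessed vessel location report",
        method="fuzzy match on vessel identifiers, 0.9 confidence threshold",
    )
    print(record)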

Beyond technology, Ahlskog said he thinks the culture around relying on OSINT is starting to change.

“I think more and more of our customers and personnel realize the value of open source,” he said. “They also get it first, frankly, in many cases. It’s very immediate. … It wasn’t possible 10 years ago, or 15 years ago, for information to be that widespread, immediately available to anyone who has a mobile device, or computer at their desk. So I think a lot of our customers, and our all source analysts and other collectors are getting much more comfortable with relying on open source early and often.”

]]>
https://federalnewsnetwork.com/intelligence-community/2022/05/spy-agencies-look-to-standardize-use-of-open-source-intelligence/feed/ 0
Federal health tech leaders want to extract data for greater equity https://federalnewsnetwork.com/technology-main/2022/05/federal-health-tech-leaders-want-to-extract-data-for-greater-equity/ https://federalnewsnetwork.com/technology-main/2022/05/federal-health-tech-leaders-want-to-extract-data-for-greater-equity/#respond Tue, 10 May 2022 19:00:40 +0000 https://federalnewsnetwork.com/?p=4051684 Designing technology infrastructure with its target communities in mind increases the chances that those groups adopt it. That, in turn, fosters greater equity in health care systems — a recurring message at last week’s Health Innovation Summit 22 from ACT-IAC.

The Biden administration has repeatedly called for equity considerations to be incorporated into new policies or programs, and more federal leaders are grasping what that means for their particular agencies.

For the Center for Medicare and Medicaid Innovation at the Department of Health and Human Services, equity is “the attainment of the highest level of health for all people,” according to its Healthy People 2030 strategic refresh. One of the refresh’s metrics for success is stronger data collection and intersectional analyses for populations defined by race, ethnicity, language, geography and disability, in order to identify gaps in care and develop interventions to address them. To do that, CMMI said it would require participants in all new models to collect and report data to identify and monitor impacts on health and the reduction of disparities, while existing models are incentivized to do the same.
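At their simplest, the intersectional analyses the refresh calls for reduce to comparing outcome rates across demographic groups. A toy pandas sketch with invented data (not CMMI’s actual reporting format) might look like this:

    import pandas as pd

    # Invented toy records; real data would come from model participants' reporting.
    visits = pd.DataFrame({
        "race": ["A", "A", "B", "B", "B", "A"],
        "geography": ["urban", "rural", "urban", "rural", "rural", "urban"],
        "received_followup": [1, 0, 1, 0, 0, 1],
    })

    # Follow-up rate per intersectional group, minus the overall rate, surfaces gaps in care.
    overall = visits["received_followup"].mean()
    by_group = visits.groupby(["race", "geography"])["received_followup"].mean()
    print((by_group - overall).rename("gap_vs_overall"))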

All new models will also include patients from historically underserved populations and safety net providers, such as community health centers and disproportionate share hospitals — facilities that serve a significantly disproportionate number of low-income patients and receive payments from the Centers for Medicare & Medicaid Services to cover care to uninsured patients.

CMMI Deputy Director Arrah Tabe-Bedward said the center has spent a lot of time trying to understand and collect data on its reach and how it can get more providers to participate.

“We think that there are incredible opportunities to do that in advanced primary care models and [accountable care organization] models. That sort of a structure, of course, requires that there is ample opportunity for information to be exchanged efficiently and effectively across providers and across care settings, in order to optimize the patient experience,” she said. “And so being able to get that right, as we are driving towards the very ambitious goal that we’ve set out for ourselves for 2030, to ensure that all of our Medicare beneficiaries, and the vast majority of those who are in Medicaid, are in those sorts of aligned care relationships with providers.”

Tabe-Bedward said CMMI wants to make sure there is technology to support those relationships and to ensure that care is being optimized.

Some communities lack experience with AI and machine learning, or actively distrust the technology, thus hampering its implementation. Susan Gregurick, associate director for Data Science at the National Institutes of Health, said this very problem was observed by NIH’s Artificial Intelligence/Machine Learning Consortium to Advance Health Equity and Researcher Diversity (AIM-AHEAD) Program.

“In this case, federated learning is the right approach. And I think that’s one of my messages is that there’s no one hammer for all the many use cases … we really have to adopt and adapt our technologies for the communities and the research programs that we really want to address,” she said.
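Federated learning, in its simplest form, trains a model locally at each site and shares only model parameters, never the underlying records. The following is a minimal sketch of one federated-averaging step with synthetic weights, not the AIM-AHEAD program’s actual stack:

    import numpy as np

    def federated_average(client_weights, client_sizes):
        """One FedAvg step: average client weights by sample count; raw data stays local."""
        total = sum(client_sizes)
        return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

    # Three hypothetical sites; the weight vectors are synthetic stand-ins.
    rng = np.random.default_rng(0)
    local_models = [rng.normal(size=4) for _ in range(3)]
    sample_counts = [1200, 300, 500]

    global_model = federated_average(local_models, sample_counts)
    print(global_model)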

As a White House Presidential Innovation Fellow at the Technology Transformation Services, Nina Walia is passionate about accessible data. For health care data in particular, the troves of PDFs and documents used by providers keep valuable information trapped, leading to a lot of redundant data entry.

“If we started to just mass and bulk and all adopt the idea of optical character recognition paired with computer vision, we could start to actually extract this data in an automated process so that this data can be machine readable,” she said.
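The OCR step Walia describes is well supported by open tooling. As one hedged example, the pytesseract wrapper (an assumption about tooling; it requires the Tesseract engine installed, and the file name is invented) can pull text out of a scanned document:

    from PIL import Image      # pip install pillow
    import pytesseract         # pip install pytesseract; needs the Tesseract binary

    # Turn a scanned form's pixels into text a downstream system can parse,
    # instead of a human re-keying the same data.
    text = pytesseract.image_to_string(Image.open("intake_form.png"))
    print(text)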

]]>
https://federalnewsnetwork.com/technology-main/2022/05/federal-health-tech-leaders-want-to-extract-data-for-greater-equity/feed/ 0
Tabletop exercises to put CMMC 2.0 through the paces https://federalnewsnetwork.com/cybersecurity/2022/05/tabletop-exercises-to-put-cmmc-2-0-through-the-paces/ https://federalnewsnetwork.com/cybersecurity/2022/05/tabletop-exercises-to-put-cmmc-2-0-through-the-paces/#respond Mon, 09 May 2022 17:28:24 +0000 https://federalnewsnetwork.com/?p=4049467 Keep an eye on the Defense Department this summer to see just how well designers of version 2.0 of the Cybersecurity Maturity Model Certification (CMMC) did.

The first real test of the revamped effort comes in the form of tabletop exercises in mid-to-late June or early July.

“We’re going to be doing some tabletop exercises where we actually fabricate a program and walk the dog, making sure we look at a proposal, that we’re looking for the right information, and we’re going to have members from the Defense Industrial Base (DIB) sector participate with us,” said Stacy Bostjanick, the chief of implementation and policy in the Office of the DoD Chief Information Officer, at the AFCEA NOVA Small Business IT Day event on May 5. “We want to hear from your perspective, whether that’s a bridge too far or that’s too hard for us. We need to rewicker it a different way to make sure that what we put together is an executable program that people can participate in and understand from the get-go what your requirements are and how you have to manage and handle things.”

DoD revamped its approach to securing controlled unclassified information (CUI) in November and is going through the federal rulemaking process. These tabletop exercises, Bostjanick said, are an important step toward DoD’s goal of launching CMMC in 2023.

One big area of interest for DoD is the use of CUI. Bostjanick said contractors who hold certain kinds of CMMC Level 2 CUI will just need to do a self-assessment, while others who hold more sensitive Level 2 data will need to get a third-party assessment.

“What we are working through with those tabletop exercises that we’re working on today is going to ferret out where we feel that we can bifurcate Level 2, where we have prioritized and non-prioritized CUI. If you are a company that has federal contract information (FCI), you got to do the 15 [security controls], you got to do that annual self-assessment and affirm that you’re compliant to those. Then, you would have to do the self-assessment once every three years,” she said. “What we’ve said when we started looking at the universe of companies, because there’s about 80,000 companies that are anticipated to be CUI holders, and undoubtedly you will not be bidding on just one contract, I think the thought process is eventually everybody will end up wanting to participate on a procurement that needs a third-party certification. But if you’re lucky enough to be only in receipt of non-prioritized CUI, you’ll be able to do that.”

As for FCI, the National Archives and Records Administration’s Information Security Oversight Office wrote in a 2020 blog that “the definition of FCI which mentions information that is ‘provided by or generated for the government under a contract to develop or deliver a product or service to the government.’ In other words, FCI is more about what the government gives to you as part of the contract or what you create for them under the contract, while CUI protected under the General Procurement and Acquisition category is mostly proprietary information and sensitive information that is provided to the government and protected throughout the contracting/award process.”

Securing data is the ultimate goal

By shifting to CMMC 2.0, DoD hopes the process to secure CUI and other more sensitive data will be less onerous and more streamlined than version 1.0. In this second iteration, DoD combined five levels into three, focused on the type of data that vendors must protect and reduced the requirement for third-party assessments.

Dr. Kelly Fletcher, the principal deputy DoD CIO, said at the AFCEA NoVA event that the Pentagon understands what it is asking contractors to take on isn’t easy and will require time and patience. DoD estimates it could take two years from when the final rule is out in mid-2023 for CMMC to hit full operational capability.

Fletcher said as the program rolls out over the next year, the ultimate goal is for contractors to do more to protect their networks, systems and data, most immediately ensuring they meet the cybersecurity practices laid out by the National Institute of Standards and Technology in Special Publication 800-171.

“We’re writing a proposed rule change to Sections 32 and 48 of the Code of Federal Regulations (CFR). That is happening behind the scenes, there are people working super, super hard on this, but you are not going to see it. We, the DoD CIO, we’re going to submit these draft rules to the Office of Management and Budget (OMB), and then they’re going to enter into the rulemaking process,” Fletcher said. “We think that the rule will be published for public comment in March 2023. The reason that’s really important is this is public comment. You have the opportunity to comment, and we want your comments. We want you to say ‘this is too onerous. This is expensive. This isn’t onerous enough.’ We want those comments, and the way that you can do that is through the OMB website, and that’ll be in about March.”

After the proposed rule is out, and it’s likely to be an interim rule with a request for comments, DoD will start adding CMMC requirements to contracts.

Fletcher said the CIO’s office is strongly encouraging contracting offices to release requests for information and other pre-solicitation notices if the upcoming request for proposals will include CMMC requirements.

“The way that you will know that CMMC is required is when you look at an RFI, RFP or solicitation, it’s going to tell you very clearly, CMMC certification is required at this level. So it’ll never be a surprise, and it’s not going to be backwards compatible,” she said.

Project Spectrum focuses on small businesses

Fletcher added that companies, especially small firms, should begin as soon as possible to prepare for CMMC no matter the level.

DoD is providing help to small companies through Project Spectrum.

Kareem Sykes, the program manager of Project Spectrum, said the initiative aims to help small firms, through their Mentor-Protégé relationships, improve cybersecurity and meet CMMC standards.

He said the current pilot effort has about 13 companies, but they want to expand it over the next year.

“As we get companies in, we educate them about what the [CMMC] requirements are and how it relates to them as a company. We get an idea of what they’re doing in the space and an understanding of what their goals are. Are they looking for Level 1, Level 2 or Level 3? Once they have an understanding of what those encompass, we talk about cyber curriculum development. A big piece of what we do is cyber coursework,” Sykes said at the AFCEA NoVA event. “I heard a lot of talk about plans of action and milestones (POA&Ms), which obviously, under the 2.0 model, will be more time-bound and where companies have a little more leeway and flexibility. So to that end, as it relates to courses, we have a POA&M course we’re hot and heavy in developing, and it should be out very, very soon.”

Sykes said participating companies need both a mentor-protégé agreement and a sponsoring DoD agency.

He said the first step for companies who want to participate is to apply for the program and complete a cyber readiness check, specifically with a focus on NIST 800-171 and CMMC requirements.

Once they are accepted into the program, then comes the training based on the results of the readiness check.

“We’re going to schedule a tech review call, want to make sure that they, in no uncertain terms, know they have access to our live cyber advisors, and then within two business days of that call, that’s when we get into the actual customized compliance plan or the training plan,” he said. “They have actionable data and information from which to move forward in their journey.”

Incentives for contractors

Additionally, Congress has approved, but not yet funded, authority for DoD to provide grants or loans to companies to meet the CMMC requirements. Bostjanick said there are constant conversations about how best to help small firms improve their cyber postures.

In the meantime, companies can help offset the costs of CMMC in other ways.

“CMMC is an allowable cost. It’s a cost of doing business,” she said. “You can include that in your overhead and general and accounting rates to be able to recoup the cost that you’ve spent implementing CMMC.”
\u201cYou can include that in your overhead and general and accounting rates to be able to recoup the cost that you've spent implementing CMMC.\u201dnnDoD also is hiring more assessors through the Defense Contract Management Agency (DCMA). Bostjanick said DCMA received funding to hire 140 new assessors for the Defense Industrial Base Cybersecurity Assessment Center (DIBCAC) team, and has started to bring them on board.nn\u201c[DCMA wants to] make sure that they have the capacity to handle the CMMC Level 3 certifications. They are also the ones that are doing the CMMC Level 2 certifications for the certified third-party assessment organizations (C-3PAOs),\u201d she said. \u201cWe are quite confident that the DIBCAC does have the bandwidth and the capability to handle anything that we are going to need in the future.\u201dnnFletcher said despite the challenges with CMMC 1.0 and the move to 2.0, companies are understanding today more than ever why they must do a better job securing data.nn\u201cBy the summer, if you know that you're CMMC compliant, if you feel confident in your networks and you've done, perhaps, some of this early actions, you're going to be super well postured for when you're going to start seeing RFIs and RFPs that call for CMMC requirements\u201d she said. \u201cIf it were me, I would want to be some one of the early adopters, and I think it's going get rid of a lot of competition, although I could be wrong. I think not in the long term. But certainly initially.\u201d"}};

Keep an eye on the Defense Department this summer to see just how well designers of version 2.0 of the Cybersecurity Maturity Model Certification (CMMC) did.

The first real test of the revamped effort comes in the form of tabletop exercises in mid-to-late June or early July.

Stacy Bostjanick is the chief of implementation and policy in the Office of the DoD Chief Information Officer.

“We’re going to be doing some tabletop exercises where we actually fabricate a program and walk the dog, making sure we look at a proposal, that we’re looking for the right information and we’re going to have members from the Defense Industrial Base (DIB) sector participate with us,” said Stacy Bostjanick, the chief of implementation and policy in the Office of the DoD Chief Information Officer, at the AFCEA NOVA Small Business IT Day event on May 5. “We want to hear from your perspective, whether that’s a bridge too far or that’s too hard for us. We need to re-wicker it a different way to make sure that what we put together is an executable program that people can participate in and understand from the get-go what your requirements are and how you have to manage and handle things.”

DoD revamped its approach to securing controlled unclassified information (CUI) in November and is going through the federal rulemaking process. These tabletop exercises, Bostjanick said, are an important step toward DoD’s goal of launching CMMC in 2023.

One big area of interest for DoD is the use of CUI. Bostjanick said contractors who hold certain kinds of CMMC Level 2 CUI will just need to do a self-assessment while others who hold more sensitive Level 2 data will need to get a third-party assessment.

“What we are working through with those tabletop exercises that we’re working on today is going to ferret out where we feel that we can bifurcate Level 2, where we have prioritized and non-prioritized CUI. If you are a company that has federal contract information (FCI), you got to do the 15 [security controls], you got to do that annual self assessment and affirm that you’re compliant to those. Then, you would have to do the self assessment once every three years,” she said. “What we’ve said when we started looking at the universe of companies because there’s about 80,000 companies that are anticipated to be CUI holders, and undoubtedly you will not be bidding on just one contract, I think the thought process is eventually everybody will end up wanting to participate on a procurement that needs a third-party certification. But if you’re lucky enough to be only in receipt of non-prioritized CUI, you’ll be able to do that.”
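To make the bifurcation concrete, here is a minimal sketch of how the tiering Bostjanick describes might be encoded. The mapping follows the article (FCI means an annual self-assessment against 15 controls, non-prioritized CUI a triennial self-assessment, prioritized CUI a third-party certification); the enum names and `required_assessment` helper are invented for illustration.

```python
from enum import Enum

class DataType(Enum):
    FCI = "federal contract information"
    CUI_NON_PRIORITIZED = "non-prioritized CUI"
    CUI_PRIORITIZED = "prioritized CUI"

class Assessment(Enum):
    ANNUAL_SELF = "annual self-assessment against the 15 FCI controls"
    TRIENNIAL_SELF = "self-assessment once every three years"
    THIRD_PARTY = "third-party (C3PAO) certification"

# Tiering as described in the article; the structure itself is invented.
REQUIRED = {
    DataType.FCI: Assessment.ANNUAL_SELF,
    DataType.CUI_NON_PRIORITIZED: Assessment.TRIENNIAL_SELF,
    DataType.CUI_PRIORITIZED: Assessment.THIRD_PARTY,
}

STRINGENCY = [Assessment.ANNUAL_SELF, Assessment.TRIENNIAL_SELF, Assessment.THIRD_PARTY]

def required_assessment(data_held):
    """Return the most stringent assessment implied by the data a company holds."""
    return max((REQUIRED[d] for d in data_held), key=STRINGENCY.index)

# A company holding both FCI and prioritized CUI needs the third-party route.
print(required_assessment({DataType.FCI, DataType.CUI_PRIORITIZED}))
```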

As for FCI, the National Archives and Records Administration’s Information Security Oversight Office explained the distinction in a 2020 blog: “The definition of FCI mentions information that is ‘provided by or generated for the government under a contract to develop or deliver a product or service to the government.’ In other words, FCI is more about what the government gives to you as part of the contract or what you create for them under the contract, while CUI protected under the General Procurement and Acquisition category is mostly proprietary information and sensitive information that is provided to the government and protected throughout the contracting/award process.”

Securing data is the ultimate goal

By shifting to CMMC 2.0, DoD hopes the process to secure CUI and other more sensitive data will be less onerous and more streamlined than under version 1.0. In this second iteration, DoD combined five levels into three, focused the requirements on the type of data vendors must protect and reduced the requirement for third-party assessments.

Dr. Kelly Fletcher, the principal deputy DoD CIO, said at the AFCEA NoVA event that the Pentagon understands that what it is asking contractors to take on isn’t easy and will require time and patience. DoD estimates it could take two years from when the final rule is out in mid-2023 for CMMC to hit full operational capability.

Fletcher said that as the program rolls out over the next year, DoD’s ultimate goal is for contractors to do more to protect their networks, systems and data, most immediately by ensuring they meet the cybersecurity practices laid out by the National Institute of Standards and Technology in Special Publication 800-171.

“We’re writing a proposed rule change to Sections 32 and 48 of the Code of Federal Regulations (CFR). That is happening behind the scenes, there are people working super, super hard on this, but you are not going to see it. We, the DoD CIO, we’re going to submit these draft rules to the Office of Management and Budget (OMB), and then they’re going to enter into the rulemaking process,” Fletcher said. “We think that the rule will be published for public comment in March 2023. The reason that’s really important is this is public comment. You have the opportunity to comment, and we want your comments. We want you to say ‘this is too onerous. This is expensive. This isn’t onerous enough.’ We want those comments, and the way that you can do that is through the OMB website, and that’ll be in March.”

After the proposed rule is out, and it’s likely to be an interim rule with a request for comments, DoD will start adding CMMC requirements to contracts.

Fletcher said the CIO’s office is strongly encouraging contracting offices to release requests for information and other pre-solicitation notices if the upcoming request for proposal will include CMMC requirements.

“The way that you will know that CMMC is required is when you look at an RFI, RFP or solicitation, it’s going to tell you very clearly, CMMC certification is required at this level. So it’ll never be a surprise, and it’s not going to be backwards compatible,” she said.

Project Spectrum focuses on small businesses

Fletcher added that companies, especially small firms, should begin as soon as possible to prepare for CMMC no matter the level.

DoD is providing help to small companies through Project Spectrum.

Kareem Sykes, the program manager of Project Spectrum, said the initiative aims to help small firms, through their Mentor-Protégé relationships, improve their cybersecurity and meet CMMC standards.

He said the current pilot effort has about 13 companies, but the program aims to expand over the next year.

“As we get companies in, we educate them about what the [CMMC] requirements are and how it relates to them as a company. We get an idea of what they’re doing in the space and an understanding of what their goals are. Are they looking for Level 1, Level 2 or Level 3? Once they have an understanding of what those encompass, we talk about cyber curriculum development. A big piece of what we do is cyber coursework,” Sykes said at the AFCEA NoVA event. “I heard a lot of talk about plans of action and milestones (POA&Ms), which obviously, under the 2.0 model, will be more time-bound and where companies have a little more leeway and flexibility. So to that end, as it relates to courses, we have a POA&M course we’re hot and heavy in developing, and it should be out very, very soon.”
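Because CMMC 2.0 makes POA&Ms time-bound, a contractor might track open items against explicit deadlines. A minimal sketch of such a tracker; the 180-day closure window and the record fields are assumptions for illustration, not figures from the article or the rule.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

# Assumed closure window, for illustration only; the rulemaking sets the real limit.
CLOSURE_WINDOW_DAYS = 180

@dataclass
class POAMItem:
    control_id: str           # e.g., a NIST SP 800-171 requirement such as "3.1.1"
    weakness: str
    opened: date
    closed: Optional[date] = None

    @property
    def deadline(self) -> date:
        return self.opened + timedelta(days=CLOSURE_WINDOW_DAYS)

    def is_overdue(self, today: date) -> bool:
        return self.closed is None and today > self.deadline

items = [POAMItem("3.1.1", "No session lock on shared workstations", date(2022, 5, 1))]
print([i.control_id for i in items if i.is_overdue(date(2022, 12, 1))])  # ['3.1.1']
```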

Sykes said participating companies need both a mentor-protégé agreement and a sponsoring DoD agency.

He said the first step for companies who want to participate is to apply for the program and complete a cyber readiness check, specifically with a focus on NIST 800-171 and CMMC requirements.

Once they are accepted into the program, then comes the training based on the results of the readiness check.

“We’re going to schedule a tech review call, want to make sure that they, in no uncertain terms, know they have access to our live cyber advisors, and then within two business days of that call, that’s when we get into the actual customized compliance plan or the training plan,” he said. “They have actionable data and information from which to move forward in their journey.”
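A readiness check of this kind often resembles DoD’s NIST SP 800-171 assessment methodology, in which a perfect score is 110 and each unimplemented requirement deducts a weighted 1, 3 or 5 points. A rough sketch under that assumption; the control weights shown are illustrative examples, not the official scoring table.

```python
# Illustrative scoring in the style of the DoD SP 800-171 assessment methodology:
# start from a perfect 110 and deduct a per-requirement weight (1, 3 or 5) for
# each control not yet implemented. Weights below are examples, not the table.
WEIGHTS = {
    "3.1.1": 5,   # limit system access to authorized users
    "3.5.3": 5,   # multifactor authentication
    "3.3.1": 3,   # system audit logging
    "3.4.9": 1,   # control user-installed software
}

def readiness_score(unimplemented):
    """Score a self-assessment; unknown controls default to a 1-point deduction."""
    return 110 - sum(WEIGHTS.get(control, 1) for control in unimplemented)

print(readiness_score(["3.5.3", "3.4.9"]))  # 110 - 5 - 1 = 104
```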

Incentives for contractors

Additionally, Congress has authorized, but not yet funded, DoD to provide grants or loans to companies to meet the CMMC requirements. Bostjanick said there are constant conversations about how best to help small firms improve their cyber postures.

In the meantime, companies can help offset the costs of CMMC in other ways.

“CMMC is an allowable cost. It’s a cost of doing business,” she said. “You can include that in your overhead and general and administrative rates to be able to recoup the cost that you’ve spent implementing CMMC.”
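To make the allowable-cost point concrete: folding CMMC spend into an indirect cost pool raises the overhead rate applied across contracts, which is how the cost gets recouped. A toy calculation with invented numbers:

```python
# Toy numbers, purely illustrative of the rate mechanics.
direct_labor = 2_000_000   # annual direct labor base ($)
overhead_pool = 600_000    # existing indirect costs ($)
cmmc_cost = 50_000         # annual CMMC compliance spend ($)

rate_before = overhead_pool / direct_labor                 # 30.0%
rate_after = (overhead_pool + cmmc_cost) / direct_labor    # 32.5%
print(f"overhead rate: {rate_before:.1%} -> {rate_after:.1%}")
```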

DoD also is hiring more assessors through the Defense Contract Management Agency (DCMA). Bostjanick said DCMA received funding to hire 140 new assessors for the Defense Industrial Base Cybersecurity Assessment Center (DIBCAC) team, and has started to bring them on board.

“[DCMA wants to] make sure that they have the capacity to handle the CMMC Level 3 certifications. They are also the ones that are doing the CMMC Level 2 certifications for the certified third-party assessment organizations (C-3PAOs),” she said. “We are quite confident that the DIBCAC does have the bandwidth and the capability to handle anything that we are going to need in the future.”

Fletcher said that despite the challenges with CMMC 1.0 and the move to 2.0, companies understand more than ever why they must do a better job securing data.

“By the summer, if you know that you’re CMMC compliant, if you feel confident in your networks and you’ve done, perhaps, some of these early actions, you’re going to be super well postured for when you start seeing RFIs and RFPs that call for CMMC requirements,” she said. “If it were me, I would want to be one of the early adopters, and I think it’s going to get rid of a lot of competition, although I could be wrong. I think not in the long term. But certainly initially.”

Outgoing intelligence community data chief previews forthcoming data strategy https://federalnewsnetwork.com/inside-ic/2022/05/outgoing-intelligence-community-data-chief-previews-forthcoming-data-strategy/ Fri, 06 May 2022 18:01:40 +0000

The intelligence community is drafting a new data strategy for the first time since 2017, with a big focus on training a data savvy workforce well equipped to take advantage of an increasing deluge of information that intelligence agencies are both collecting and producing.

Intelligence agencies have made “great strides” since the first data strategy was published in 2017, according to Nancy Morgan, who just retired as chief data officer of the intelligence community. Her last day was April 29, and the Office of the Director of National Intelligence has yet to select her replacement.

“We’ve made some significant improvements to what we’ve been doing with data lifecycle management since the first IC data strategy was published in 2017,” Morgan said in an April 28 interview on All About Data and Inside the IC. “We feel we’ve done a lot of work to enhance sharing and safeguarding, but there’s still more to do.”

Chief data officers across the 18 intelligence agencies are focused on using automation to do more data preparation, Morgan said. The goal is to give analysts more time to “do higher order tasks” rather than rudimentary jobs like data tagging.
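As a toy illustration of the kind of data preparation being automated, a rule-based tagger can attach topic labels to incoming records so analysts don’t have to. The keyword rules and record format here are invented for the example; real IC pipelines would rely on trained models and controlled vocabularies rather than regular expressions.

```python
import re

# Invented keyword rules; a production pipeline would use trained models
# and controlled vocabularies rather than regular expressions.
RULES = {
    "geospatial": re.compile(r"\b(satellite|imagery|geolocation)\b", re.I),
    "cyber": re.compile(r"\b(malware|intrusion|phishing)\b", re.I),
}

def tag(record: str) -> list:
    """Return the topic tags whose rules match the record text."""
    return [topic for topic, pattern in RULES.items() if pattern.search(record)]

print(tag("New satellite imagery shows increased activity."))  # ['geospatial']
```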

“We’re collecting and producing more information than ever before, the IC is launching more collection capabilities than ever before at astounding volumes, certainly since I began my career 30-plus years ago,” Morgan said. “It’s just astounding how much information we’re gathering. So it creates a data volume challenge.”

ODNI is also updating the IC IT Enterprise, or “ICITE,” strategy, a major guiding document for how intelligence agencies will use computing in the years ahead. The work is being led by Adele Merritt, the chief information officer for the intelligence community.

The new IT strategy will be pivotal to “enhance the critical data management capabilities to achieve our goals,” Morgan said.

CDOs in the intelligence community are also looking to create more interoperability across intel agencies and the broader Defense Department. Morgan said leaders want to share successful approaches across organizations.

“How do we integrate and involve multidisciplinary approaches that solve the IC’s most challenging and emerging data issues?” she said. “We find new data challenges every day in every domain area.”

But beyond technology, a major piece of the forthcoming data strategy is the workforce. Morgan said spy agencies aren’t just focused on bringing in highly sought-after data scientists, but also training the existing workforce to be more data savvy.

“How do we increase the data acumen and tradecraft, by not only attracting but developing, growing and resourcing the data savvy workforce?” she said. “So not just the talent we recruit, but the workforce we already have. How do we give people a chance to develop new skills and make them even more powerful and valuable to the community?”

Career pivots

IC data leaders are looking to create opportunities for intelligence professionals to start learning new skills related to digital technologies, data and cybersecurity, according to Morgan.

“It’s really very powerful when our domain experts learn some of the foundational skills for working with technology, working with automation, working with artificial intelligence, machine learning, being paired up with data scientists and data engineers,” she said.

The focus isn’t just on developing data professionals, but on building data aptitude across mission, business and policy areas, including acquisition, contracting, privacy and civil liberties, legal divisions and finance, according to Morgan.

“Frankly, it’s about supervisors, managers, leaders, senior executives at all levels of the organization,” she said. “Are we asking the right questions about data when it’s presented to us? Do we understand the data that’s driving our decision making? We say the words ‘data-driven decision making,’ but how are we actually putting that into practice?”

Morgan noted the Fiscal Year 2022 National Defense Authorization Act requires the Office of Personnel Management to establish new occupational series for not just “data science,” but “data management” as well.

“I was really proud of helping influence some of the wording on that, because while I absolutely want to have a strong data science cadre, you need the full data management realm,” she said. “You need data managers, data policy experts, in addition to those data scientists and those data engineers.”

ODNI is also preparing to conduct the pilot phase of a new public-private talent exchange. It will allow intelligence officers to work temporarily in the private sector, and vice versa. The pilot phase will allow for six-month details, according to Morgan.

The pilot phase will include specific focus areas, including professionals working in data, as well as a category for artificial intelligence and machine learning, according to Morgan.

“Launching the pilot is a bit complicated, working through some of the security issues, working through some of the acquisition and legal issues,” she said. “But our goal is really to help intelligence officers and private sector colleagues better understand each other’s mission landscape, inject diverse thinking and gain new insights and really, hopefully, create a more two-way flow of talent, skills and ideas.”

She also said it could help inculcate a culture where there’s more back-and-forth between the government and private sector.

“I don’t know that people will have the same sort of trajectory of a career that’s more only in the government or only in the private sector,” Morgan said. “I hope we’ll see more two-way movement and more continuous movement over the time of someone’s career. And again, selfishly, for me, this helps us grow our digital data and cyber savvy workforce with real world experiences.”

NGA looks to speed up software development with key metrics, ‘CORE’ capabilities https://federalnewsnetwork.com/intelligence-community/2022/04/nga-looks-to-speed-up-software-development-with-key-metrics-core-capabilities/ Fri, 29 Apr 2022 18:50:15 +0000

The National Geospatial-Intelligence Agency is looking to deliver software more like the tech industry under a new strategy that sets key metrics for both internal development teams and contractors.

“The NGA Software Way” lays out how the agency envisions delivering software faster and more consistently, as NGA’s technology priorities increasingly revolve around software-enabled capabilities like automation and machine learning.

Officials believe automation, artificial intelligence and machine learning will be key at NGA to analyzing a rapidly increasing volume of satellite imagery and other geospatial intelligence data that could overwhelm human analysts. NGA also recently took over Project Maven, a major AI program that’s been at the forefront of the Pentagon’s recent software development projects.

NGA’s new software strategy describes three key metrics as “availability,” “lead time for changes,” and “deployment frequency.” Each individual software product will have its own “product-specific metrics” as well, tailored to track how well the software is working for its users.
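Those three measures track closely with the industry’s DORA metrics, which can be computed from routine deployment records. A minimal sketch assuming a simple event log; the log format and all figures are invented.

```python
from datetime import datetime, timedelta

# Invented deployment log: (commit_time, deploy_time) pairs over a 30-day window.
deploys = [
    (datetime(2022, 4, 1, 9), datetime(2022, 4, 2, 15)),
    (datetime(2022, 4, 5, 11), datetime(2022, 4, 5, 17)),
    (datetime(2022, 4, 9, 8), datetime(2022, 4, 11, 10)),
]
window = timedelta(days=30)
downtime = timedelta(hours=2)  # invented outage total for the window

lead_times = [deployed - committed for committed, deployed in deploys]
avg_lead_time = sum(lead_times, timedelta()) / len(lead_times)
deploy_frequency = len(deploys) / window.days        # deployments per day
availability = 1 - downtime / window                 # fraction of the window up

print(avg_lead_time, f"{deploy_frequency:.2f}/day", f"{availability:.3%}")
```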

“We put this out for really anyone delivering software at NGA,” NGA Chief Technology Officer Alex Loehr said. “That could be government employees, industry, even commercial products that NGA is buying. There are significant parts of the software way that relate to how we want to work with those companies. And so we hope that this will set common expectations of how we can deliver useful software faster and for our mission.”

The software strategy complements NGA’s recently released technology focus areas. The big priorities include assured positioning, navigation, timing and targeting; accelerated tasking orchestration; data access and data integrity; and analytic workflow modernization.

Loehr said the software strategy is an “implementation guide” for NGA’s technology focus areas.

“If the tech focus areas are the ‘what,’ the Software Way is ‘how,’” he said.

NGA wrote the “Software Way” based on several existing documents, including the U.S. Digital Service’s “Digital Services Playbook” and the U.K. government’s “Service Standard,” according to Loehr. The agency also looked to research and data from industry, specifically from DevOps Research and Assessment, or “DORA,” a company owned by Google’s parent company, Alphabet.

Loehr said NGA took best practices from those documents and used them as a foundation for the software strategy, while taking into account the more unique needs of an intelligence agency.

“Some of those other documents are much more about citizen facing services,” he said. “At NGA, we do have some of those, but not everything we do is open and public. And so some of the elements from those other documents didn’t fit exactly, but we were able to build off the core of those documents in order to learn from those who came before and did a lot of really hard work and grow in a way that matches what we need at NGA.”

NGA published an initial version of the document last year and received more than 300 pages of responses from 47 companies.

“We got some feedback around things that were unclear, that didn’t make sense, as well as lessons that we learned about how we need to work at NGA and work with our industry partners to make this document successful,” Loehr said. “Some of that didn’t make it into the words of the document itself, but did start driving some work we’re doing to make sure that as we implement the NGA Software Way, we’re able to do it successfully.”

CORE developments

To help meet the goals of the strategy, NGA has established a Common Operating Release Environment, called “CORE,” to provide development teams with enterprise software delivery tools like version control, testing, and tracking and collaboration tools.

“Historically, we’ve let different teams choose their tools and their different processes of how they build software,” Loehr said. “That led to some really important things, but it also led to a lot of fragmentation. And what we’re trying to do is build one set of tooling and one set of processes.”

Many pieces of CORE are already in place and being used by mission critical applications in some cases, according to Loehr, including version control, the “CI/CD” pipeline, an API developer portal, and issue tracking and documentation spaces.

Enterprise workflow orchestration and messaging tools, meanwhile, are still “more in the beta phase,” Loehr said.

“The core of the CORE around the version control, the pipeline, the developer portal, all that is live, real and being used today,” he said. “And we are looking at growing that usage pretty significantly.”
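The common tooling CORE standardizes can be thought of as one ordered pipeline every product passes through: version control feeds a shared build-and-test chain that halts on the first failure. A minimal sketch using generic CI/CD stage names, not NGA’s actual configuration:

```python
from typing import Callable, List, Tuple

def run_pipeline(stages: List[Tuple[str, Callable[[], bool]]]) -> bool:
    """Run stages in order, halting at the first failure."""
    for name, step in stages:
        print(f"running {name} ...")
        if not step():
            print(f"{name} failed; halting")
            return False
    return True

# Placeholder steps standing in for real lint/test/build/deploy jobs.
ok = run_pipeline([
    ("lint", lambda: True),
    ("unit-tests", lambda: True),
    ("build-artifact", lambda: True),
    ("deploy-to-staging", lambda: True),
])
print("pipeline succeeded" if ok else "pipeline failed")
```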

Several years ago, NGA began developing an in-house software developer corps. Now, NGA is also looking to build out a key competency in the form of product managers who can shepherd a software project through development successfully.

“The person that acts as the interface between those end users and the development team and understands the vision for the product, creates the roadmap and makes sure that what is being built is actually both useful and actually used,” Loehr said. “That’s been a discipline that we are bringing into NGA and that we’re helping grow. I think it will be really important for our future on how we make sure that we are building not just any software, but the right software, and it’s actually delivering on our mission.”

‘Build low, push high’

NGA is also increasingly developing its software in unclassified environments, called the “low side” in intelligence jargon, before it’s pushed to the “high side,” or a classified environment. The concept is “build low, push high,” according to Loehr.

“A lot of our workforce, and our contractor workforce doesn’t want to be in a [Sensitive Compartmented Information Facility] every day,” he said. “And also a lot of our software itself isn’t necessarily classified. The data that’s in it might be classified, and often not in all cases, but often our software isn’t.”

The CORE tooling includes the ability to sync software versions across classified and unclassified domains, Loehr said, a key process for speeding up development.

“Those process pieces are almost just as important as technology pieces,” he said. “And enabling us to build low and move high, I think will help us move faster and really increase the diversity that we’re able to have in the people working on our products and how that work gets done.”
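One way to picture “build low, push high” is as a one-way promotion: artifacts built and hashed on the low side are compared against what the high side already holds, and only missing versions are queued for transfer. A purely illustrative sketch; the manifest format is invented and the accredited cross-domain transfer and security review steps are deliberately elided.

```python
import hashlib

def digest(artifact: bytes) -> str:
    """Content hash used to detect versions the high side does not yet hold."""
    return hashlib.sha256(artifact).hexdigest()

# Invented stores: low side holds built artifacts, high side holds known digests.
low_builds = {"1.5.0": b"...binary...", "1.6.0": b"...new binary..."}
high_known = {"1.5.0": digest(b"...binary...")}

# Queue only versions whose digests the high side lacks. A real transfer would
# pass through an accredited cross-domain guard and security review, omitted here.
to_promote = [v for v, blob in low_builds.items() if high_known.get(v) != digest(blob)]
print(to_promote)  # ['1.6.0']
```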
