5G is changing the way the military connects its bases and its people
https://federalnewsnetwork.com/federal-insights/2022/02/federal-insights-smart-base-of-the-future/ (Feb. 1, 2022)

As 5G begins to roll out in civilian and military spaces, the Defense Department is testing 5G networks at a handful of bases to ensure connectivity and security.

Since last year, the Pentagon has had contracts for 5G experimentation at nearly a dozen different installations to make “smart bases.”

“In essence, a smart base of the future is the integration of connected technologies that will fundamentally improve the performance and efficiency of assets and services across a military installation,” said Cornelius Brown, Verizon’s Department of Defense sales director, during the Federal Insights discussion Smart Base of the Future, sponsored by Verizon. “As we define smart bases, we can essentially view them as mini cities in themselves, where infrastructure, buildings, transportation and energy management are all factors of a city and a base. What drives a smart base is that they’re all hyper-connected; it’s an ecosystem where everything becomes connected.”

That kind of system is what the Pentagon is aiming for when it comes to its future weapons systems, artificial intelligence and how it wants everything to be interconnected with data and interoperability.

“There’s a large collaboration with multiple industry players to really enable various use cases around energy management, drone management, autonomous vehicles, and it’s really a collaborative effort to really prove out what the customer wants to accomplish,” Brown said. “5G brings mobile edge computing closer to the end user, or the application via the cloud. This really removes a lot of the historical latency that prevented applications from being able to make near real-time decisions.”

It’s not just the Pentagon’s weapons systems that will see changes with 5G. Military personnel will also see their connected lives move in a different direction as well.

“For military personnel living on the base, they’ll be able to take an autonomous shuttle or bus that will get them around, all that cool stuff,” Bryan Schromsky, a managing partner for 5G public sector at Verizon, said. “More importantly, they’ll see it in connection. If they’re using a video calling platform on these connected devices, they’ll just have a much better experience. They’ll be in high definition video, they’ll have faster bit rates, so they can do more videos. They can upload information, all of that good stuff, and couldn’t ask for much more to have a very intimate and immersive experience in a personal way.”

With that new power and ease also comes new responsibilities and challenges, however.

Adding more devices to the network creates more opportunities for malicious actors to get in. DoD will need to protect its assets from outside attacks like hackers and inside threats like a compromised supply chain.

“Having a coordinated security policy is the right approach,” Schromsky said. “We want to make sure we have a valid, secure supply chain and also work with industry standards. Most importantly we want to have outreach in working with public sector agencies, federal, state and local to make sure that we meet their needs and their challenges.”

As 5G develops, DoD will need to rely on future technologies to strengthen its security measures.

Schromsky said quantum computing will have a large role to play in encryption and not only making military bases safe, but also securing financial transactions and personal data.


Listen to the full show:

How the Air Force manages risk in its supply chains
https://federalnewsnetwork.com/defense-main/2021/10/how-the-air-force-manages-risk-in-its-supply-chains/ (Oct. 27, 2021)

When a part is needed for a plane in the Air Force, it usually comes through the 448th Supply Chain Wing in the Air Force Sustainment Center.

Ensuring that all those parts get to their destinations, and that they are not tampered with or sabotaged, calls for a good deal of risk management.

Stephen Gray, director of the 448th Supply Chain Wing, says that risk management has been an ever-changing and challenging problem as the United States tackles supply shortages and a pandemic, and tries to lessen its reliance on Chinese goods.

“We’ve invested heavily in supply chain risk management capabilities, to where we can trace our supply chains down to the third, fourth or even fifth tier supplier,” Gray said on Federal Insights: Supply Chain. “We know where our parts are coming from. We are able to identify where we were going to see a risk. Now that could be a risk due to a weather event, a cyber attack, a financial problem within a supplier. We’ve even been able to map out those areas more to understand what type of disruptions could occur in our supply chain.”

Using that information, Gray said, the Air Force is able to prioritize high-risk suppliers and commodities. If the service needs to, it can develop a new supplier or continue working with one that is having issues. He tipped his hat to Congress for providing a stimulus package that allows the United States to do that.
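Tracing a part’s origin through third-, fourth- and fifth-tier suppliers, as Gray describes, is at heart a graph-traversal problem. The sketch below is a minimal illustration of that idea only; the supplier names, tier structure and risk flags are invented, and this is not the Air Force’s actual tooling.

```python
# Minimal sketch of multi-tier supplier tracing. The parent -> sub-supplier
# mapping and the risk flags are invented for illustration.

SUB_SUPPLIERS = {
    "Prime Contractor": ["Airframe Co", "Avionics Co"],
    "Airframe Co": ["Forging Shop", "Fastener Plant"],
    "Avionics Co": ["Chip Broker"],
    "Chip Broker": ["Overseas Foundry"],
}

RISK_FLAGS = {
    "Forging Shop": ["financial distress"],
    "Overseas Foundry": ["geopolitical", "severe weather"],
}

def trace(supplier, tier=1, seen=None):
    """Walk the supply chain depth-first, yielding (tier, supplier, risk flags)."""
    seen = seen if seen is not None else set()
    for sub in SUB_SUPPLIERS.get(supplier, []):
        if sub in seen:  # guard against cycles in the mapping
            continue
        seen.add(sub)
        yield tier, sub, RISK_FLAGS.get(sub, [])
        yield from trace(sub, tier + 1, seen)

if __name__ == "__main__":
    for tier, name, risks in trace("Prime Contractor"):
        note = "; ".join(risks) if risks else "no known risk flags"
        print(f"tier {tier}: {name} ({note})")
```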

While the Air Force has decent transparency into its supply chain and its suppliers, it still faces challenges in delivering goods on time and at a fair cost.

“We’re going to see inflation across the board as a result of the disruptions that we’re seeing in the pandemic,” Gray said. “Anything we do to [reshore] work will drive some inflation into the market, as well. The defense budget is closely scrutinized and there’s only so much we’re able to do to cope with large inflation. I think that’s one area that we’re very sensitive to and trying to watch.”

Gray said his office is working with companies on ways to keep the prices of goods from getting out of hand.

Another issue is the continuing pains from the COVID-19 pandemic, which are still keeping goods at ports and causing long waits for some products.

“The next pandemic, or preparing for that, is really one of the things that’s on our mind as well,” Gray said. “How do we prepare ourselves, protect ourselves and be able to operate in either a continuation of the current pandemic or the next one that comes down the road? We just think, from our business readiness perspective and being prepared for disruptions, that we need to keep our eye on the ball. We think the lessons learned that we found in the COVID-19 pandemic will serve us well for the next one.”

How the DLA pivoted to work supply chains during COVID
https://federalnewsnetwork.com/defense-main/2021/10/how-the-dla-pivoted-to-work-supply-chains-during-covid/ (Oct. 20, 2021)

The Defense Logistics Agency is in charge of moving $40 billion worth of goods around the world per year, but when COVID hit and supply chains started moving in fits and starts, the organization had to start changing to get goods delivered on time.

DLA works on delivering everything from food to fuel generators to respirators, all important components of the supply chain during COVID.

“When COVID came along, initially, the early efforts were really focused on the operational support,” Rear Adm. Doug Noble, director of logistics operations for DLA, said on Federal Insights: Supply Chain. “From a DLA perspective that quickly evolved to things like personal protective equipment and ventilators. We’ve done over 325 million test kits, over 5 billion sets of gloves, 70 million N-95 masks, almost 300 million surgical masks, and 200 million gowns and just shy of 6,000 ventilators.”

Noble said DLA worked closely with the Federal Emergency Management Agency to find ways to deliver quickly in emergency situations.

“We quickly found ourselves evolving into a role of providing support to FEMA for their mission assignments, but also maturing our relationship with [the Department of] Health and Human Services to provide those items,” Noble said. “As time progressed a little bit further, we were able to get the supply chain to the point where we could start working on restocking the Strategic National Stockpile with all that critical PPE. As a result of working through that process, we have agreements now with HHS where we’ve done about $5 billion to replenish that Strategic National Stockpile.”

Noble said there is still about $7 billion more that needs to be restocked.

As DLA and the nation as a whole rebuild their supply chains, Noble said, the Defense Department is rethinking its security.

Reliance on China, Russia and other countries, as well as threats from terrorism and extreme weather, can all keep critical goods from getting to their destinations.

Noble said DLA is working to identify the most critical supply chains so they can be protected in the future.

“Can you sift through all that data and that information to know where you really need to prioritize? And to identify those critical items, we engage through the services to pick out those items. They tell us what weapon systems are their critical key weapon systems,” Noble said. “Through the configuration management of those systems we’re able to identify what the key components and the key parts are that would render a weapon system inoperable if the parts weren’t available.”
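As a rough illustration of the logic Noble describes, the sketch below flags parts that are mission critical and lack an alternate qualified source; it is not DLA’s system, and the systems, part numbers and attributes are all invented.

```python
# Toy example of surfacing mission-critical, single-source parts.
# Weapon systems, part numbers and attributes are invented.

parts_catalog = [
    {"system": "Radar-X", "part": "PN-1001", "mission_critical": True,  "qualified_sources": 1},
    {"system": "Radar-X", "part": "PN-1002", "mission_critical": False, "qualified_sources": 3},
    {"system": "Truck-Y", "part": "PN-2001", "mission_critical": True,  "qualified_sources": 2},
]

def priority_parts(catalog):
    """Return parts whose loss would render a system inoperable and that lack a second source."""
    return [p for p in catalog if p["mission_critical"] and p["qualified_sources"] < 2]

for p in priority_parts(parts_catalog):
    print(f'{p["system"]}: protect supply of {p["part"]} (single qualified source)')
```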

DLA is also working on the ability to surge a supply chain when needed. Noble said DLA is partnering with companies so it can increase supplies quickly, even when items aren’t in short supply at the time.

“We have a program that we call our war stopper program, where we address critical needs across multiple supply chains that normally it wouldn’t be economically feasible or supportable to stock all these items because they’re the ‘just-in-case’ items,” he said. “It just would be cost prohibitive to try and maintain all that capability all the time. But through innovative contracting strategies with vendors, we do things like paying a small insurance premium to reserve capacity capability so that we can surge.”

COVID has flipped what government and companies expect from supply and demand, now they are preparing
https://federalnewsnetwork.com/federal-insights/2021/10/covid-has-flipped-what-government-and-companies-expect-from-supply-and-demand-now-they-are-preparing/ (Oct. 12, 2021)

The words “supply chain” were barely in the public lexicon 10 years ago. Now the supply chain is a top-of-mind concern for industry and government agencies alike.

COVID-19 turned the world on its head, and along with it some of the everyday assumptions about what supply, demand and Adam Smith’s invisible hand could handle.

Now, the government and companies are rethinking some of the ways they do business to ensure products are delivered on time and shortages become a thing of the past.

The Biden administration issued an executive order early in the presidency to tackle supply chain issues.

“The key theme, and really everything that’s coming out of the executive order is about resiliency,” said Christine Barnhart, senior director at Infor for supply chain strategy, during a Federal Insights discussion sponsored by Infor. “The order is about how we make our supply chains less fragile and more resilient. It’s not so much about isolating us from the rest of the world, but really taking out some of the risks, and making sure that we’re able to be self-sustaining at least for a period of time.”

The United States is especially focusing on some important and fragile supply chains like electric vehicle batteries, semiconductors, microchips and rare earth minerals.

The executive order created a task force on the issue and prompted a 100-day review.

“The United States needs resilient, diverse and secure supply chains to ensure our economic prosperity and national security,” the order states. “Pandemics and other biological threats, cyber-attacks, climate shocks and extreme weather events, terrorist attacks, geopolitical and economic competition, and other conditions can reduce critical manufacturing capacity and the availability and integrity of critical goods, products and services. Resilient American supply chains will revitalize and rebuild domestic manufacturing capacity, maintain America’s competitive edge in research and development, and create well-paying jobs.”

“What that means for private companies is working with the government and understanding their own processes better,” Barnhart said.

“The government needs to incentivize companies to make investments for better resiliency, better transparency and more communication,” she said. “We have these multi-enterprise business networks and they haven’t been well-adopted across supply chain, generally on a global nature. What we’ve seen as a result of just COVID, in general, but then in government policies, and even in government research, is a real push to build better commerce networks and collaboration networks. I think you’ll see a lot more investment in that area.”

Barnhart said companies are changing their risk framework after seeing how fragile supply chains were.

“We’ve seen a ton of companies looking at, maybe not completely onshoring, but nearshoring,” she said. “So bringing their supply chains a little closer to the point of consumption. We’ve seen a ton of activity in supplier management and supplier collaboration tools. I think the other big area is really around the demand side as in demand planning, demand sensing. The demand patterns of the past have been just completely kind of thrown out the window.”


Listen to the full show:

 

How the Army is bouncing its supply chain back from the COVID freeze
https://federalnewsnetwork.com/army/2021/10/how-the-army-is-bouncing-its-supply-chain-back-from-the-covid-freeze/ (Oct. 6, 2021)

The past 18 months have thrown a wrench in many of the military’s plans, but the supply chain took a particularly hard hit.

Deacon Maddox, Army Materiel Command director of supply chain management, said there were a handful of backups that caused issues for the supply chain during the height of the coronavirus pandemic.

“When you’re on a multi-year buy plan, you’ve got a lot of capital sunk in contracts and then you’re counting on sales to occur,” Maddox said on Federal Insights: Supply Chain. “If people aren’t training or leaving the motor pool because they’re doing other things, or they’re isolating or they’re sick or they’re not going on deployments then what you have is the demands that you are counting on that don’t materialize.”

The Army’s Materiel Command uses a working capital fund in order to procure and provide materiel and commercial products and services to its forces. The service ended up needing an injection of hundreds of millions of dollars to keep the fund solvent this year.

“We honestly didn’t think we needed a cash infusion into the working capital fund [back in March] to offset health and safety leave and the increased cost of health and safety leave as others worked longer hours and overtime to keep production running,” Army Lt. Gen. Duane Gamble, deputy chief of staff for logistics, told the House Armed Services Readiness Subcommittee in March.

The Army is back on track now with its capital fund and focusing on keeping the supply chain humming so that soldiers can continue to repair and sustain their systems into the future.

Maddox said the Army had to revise its sustainment plans after taking a hit from COVID, but the service is taking care to plan for other contingencies as well.

“There’s a lot of things that can happen, climate change is something that we’re watching very closely. That could cause interruptions,” he said. “I think the overarching theme is that we’re not alone in this. We need to think about how do we make our supply chains more resilient? You’re never going to make it completely fireproof against whatever is out there. But how do you minimize the impacts of things like climate change, or pandemics or cyber attacks or anything else?”

Maddox said one of the ways the Army is ensuring more stability is through advanced technologies. The service is investing in centers of excellence and partnerships for things like additive manufacturing and 3D printing.

“One of the areas that we’re really looking at closely as we modernize is getting in on the ground level with regard to our tech data, and making that tech data available, either through a digital twin type of arrangement or when the system is fielded,” Maddox said. “We get those rights, we get that intellectual property upfront so that we can then convert that into future capabilities. This is technology looking forward into the future. How do we shape our doctrine? How do we shape policy? How do we shape our weapons systems to do these kinds of things? The Army is well on its way to having this as a capability.”

To your DevOps and CloudOps practices, add FinOps
https://federalnewsnetwork.com/federal-insights/2021/10/to-your-devops-and-cloudops-practices-add-finops/ (Oct. 5, 2021)

Cloud computing has moved from an idea, to an experiment, to a mainstream practice for federal IT departments and the programs they support. Often, the focus for cloud adoption has landed on the technical requirements – application rationalization, code updating, network requirements, to name a few.

Now it’s time to take a more comprehensive organizational approach to cloud by incorporating detailed financial planning for the cloud. Brian Reynolds, principal for the Digital Transformation and Management Practice at Grant Thornton, calls this approach FinOps, a play on the DevOps principle in which technical, program and financial people work together continuously to ensure the agency gets the best return on the dollars it invests in commercial clouds.

Costs alone are important, to be sure. They’re “absolutely a core element of ROI and understanding whether or not moving to the cloud, in fact, makes good business sense,” Reynolds said.

But direct costs are only part of the calculation. More broadly, he said, agencies need to fully understand the motivations for cloud computing to begin with.

“Those motivations can be cost savings related, but they could also be oriented around the need to exit the data center, the need to improve business agility, the need to transition from end-of-life or end-of-support for [existing] technology,” Reynolds said. He advises taking time to get all of the parties together ahead of a cloud migration. Then define the motivations and specify the metrics for the outcomes they’re seeking. The team, he added, should include program managers, infrastructure and operations people, application developers, and ideally, people from agency finance.

The motivation can help point to which applications to move sooner rather than later, Reynolds said. If cost savings is the main motivation, then analysis is likely to show it’s wiser to start with non-mission-critical applications. That approach can move the organization up the cloud maturity scale initially.

Conversely, he said, “if we have a business imperative, or a market need or some constituent need that requires mission critical applications to be adjusted or improved, then those become our priority.”
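One way to picture that prioritization logic is a simple ordering rule keyed to the stated motivation. The sketch below is illustrative only; the application names, attributes and ordering rules are assumptions, not anything drawn from Grant Thornton’s methodology.

```python
# Illustrative-only ordering of a migration backlog by stated motivation.
# Application names and attributes are invented for the example.

apps = [
    {"name": "public-portal", "mission_critical": True,  "est_savings": 0.2},
    {"name": "hr-reporting",  "mission_critical": False, "est_savings": 0.6},
    {"name": "file-archive",  "mission_critical": False, "est_savings": 0.8},
]

def migration_order(apps, motivation):
    """Order candidate apps differently depending on why the agency is moving to cloud."""
    if motivation == "cost_savings":
        # Start with low-risk, high-savings workloads to build cloud maturity first.
        return sorted(apps, key=lambda a: (a["mission_critical"], -a["est_savings"]))
    if motivation == "business_imperative":
        # Mission-critical applications jump to the front of the queue.
        return sorted(apps, key=lambda a: not a["mission_critical"])
    return apps

print([a["name"] for a in migration_order(apps, "cost_savings")])
print([a["name"] for a in migration_order(apps, "business_imperative")])
```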

Reynolds said organizations using technology business management, or TBM, have a better chance of achieving the holistic approach that brings in both technology and finance. TBM, he said, gives the organization a finance view, an IT infrastructure view, and a workload-oriented view which, combined, help drive stronger cloud decisions.

TBM, he added, ensures all of the costs are visible, including not just direct cloud charges but also the costs of migration, facilities, and labor – both internal and external.

“These costs are all embedded within your cloud service costs,” Reynolds said. “And so when we think about doing a comparison, we have to make sure we really do consider not just the obvious hardware, software licensing sorts of costs. We also have to think about some of those costs that might be less obvious. TBM can be a way to think about and get a handle on those costs very quickly.”
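Reynolds’ point about less obvious costs reduces to a straightforward total-cost comparison once every category is on the table. The sketch below shows only the arithmetic; the cost categories and dollar figures are invented placeholders, not TBM reference data.

```python
# Back-of-the-envelope total-cost comparison in the spirit of TBM:
# count migration, facilities and labor alongside the obvious charges.
# All dollar figures are invented placeholders.

on_prem = {
    "hardware_refresh": 400_000,
    "software_licensing": 250_000,
    "facilities_power_cooling": 120_000,
    "internal_labor": 300_000,
}

cloud = {
    "service_charges": 520_000,
    "one_time_migration": 150_000,
    "external_labor": 90_000,
    "internal_labor": 180_000,
}

def total(costs):
    return sum(costs.values())

print(f"on-prem total: ${total(on_prem):,}")
print(f"cloud total:   ${total(cloud):,}")
print("cheaper option:", "cloud" if total(cloud) < total(on_prem) else "on-prem")
```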

Cloud Migration Strategy and Cloud FinOps

Cloud migration is absolutely a strategy for tech debt remediation…The code base and the tech stack depreciation are certainly key to making those kinds of decisions.

Costs of Cloud Performance

I think [with] FinOps on the cloud side, we're looking not just to be informed about our costs, but to then take steps to optimize those costs, and then ultimately, to operate and continually improve those costs.

Listen to the full show:

 

Use the cloud for frictionless identity and access management
https://federalnewsnetwork.com/federal-insights/2021/10/use-the-cloud-for-frictionless-identity-and-access-management/ (Oct. 4, 2021)

Whether it’s for zero trust specifically or for improving cybersecurity generally, agencies need solid and up-to-date identity and access management systems. So-called IDAM systems should incorporate two-factor authentication, support cloud connections for applications hosted off premises, and allow for secure single sign-on so as not to make life difficult for end users.

“Over the last, you know, five or ten years, we’ve really thought about identity access management as more of a security construct,” said Sean Frazier, the federal chief security officer at Okta. “But it’s also a usability construct. We have to provide good user experiences so that users, when they log into something, it’s pretty seamless.”

Getting ID and access management right is important for several reasons. Frazier pointed out that the ID and access management “plane” in systems is an attractive place for attackers to gain access to networks and data. That in turn is one reason why current federal policy requires agencies to have specific technical strategies in place for ensuring the identity of people using federal networks.

With growing numbers of applications and databases moving to commercial cloud hosting, Frazier said it’s wise for the ID and access management plane to locate there too. With large percentages of federal employees continuing to work from home because of the pandemic, cloud became an even larger factor.

What about Active Directory or similar services that exist on premises?

“A lot of organizations who have deployed on-prem identity solutions and legacy solutions, like Active Directory, can extend that to the cloud,” Frazier said. “Okta does a really good job of extending that and ‘cloudifying’ the identity and access management, leveraging that repository.”

Cloud ID and access management platforms, he added, can also leverage other databases, such as human resources, as a “source of truth.” A second benefit after enabling secure access, Frazier said, is how cloud solutions can reduce the friction of onboarding new employees and ensuring secure removal of people who leave the agency.
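A bare-bones version of the “source of truth” idea is simply reconciling the identity store against HR status on a schedule. The sketch below is a generic illustration under that assumption; it is not Okta’s API, and the records and field names are made up.

```python
# Generic sketch of using an HR system as the "source of truth" for identities:
# provision accounts for active employees, deprovision anyone who has left.
# This is not Okta's API; records and field names are invented.

hr_records = [
    {"email": "alice@agency.gov", "status": "active"},
    {"email": "bob@agency.gov",   "status": "separated"},
]

identity_store = {"alice@agency.gov", "carol@agency.gov"}  # currently enabled accounts

def reconcile(hr_records, identity_store):
    """Return accounts to create and accounts to disable based on HR status."""
    active = {r["email"] for r in hr_records if r["status"] == "active"}
    to_create = active - identity_store
    to_disable = identity_store - active
    return to_create, to_disable

create, disable = reconcile(hr_records, identity_store)
print("provision:", sorted(create))     # new hires not yet in the directory
print("deprovision:", sorted(disable))  # separated or unknown accounts
```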

Frazier said a key benefit of cloud computing extends to ID and access management implementations. Namely, the cloud takes care of patching and otherwise updating applications hosted there. Okta partners with Amazon Web Services to host its platform. With respect to server capacity expansions or updates and patches that can tax an IT organization, “they’ve already built and automated all of this capability, including the patching and security infrastructure for what they deliver. So it allows organizations and agencies to focus on what they do for a living, which is their users and their data.”

Current Identity Management Best Practices

You need to start with secure single sign-on with multi-factor authentication as the core tenet of the identity stack.

Benefits of Cloud

It really doesn't make any sense to run your identity platform not in the cloud, because everything's in the cloud. If you do have some on-prem resources, a cloud identity solution can protect those securely as well.

Listen to the full show:

How to strengthen cyber identities and protect network privileges
https://federalnewsnetwork.com/federal-insights/2021/07/how-to-strengthen-cyber-identities-and-protect-network-privileges/ (July 26, 2021)

End points – smart phones, tablets, notebook PCs, even desktop PCs – form employees’ entry points to networks, applications and data. The rise in remote working and teleworking has made end points particularly attractive to cyber hackers, especially using ransomware attacks delivered through phishing campaigns.

According to Bryan Murphy, the senior director for consulting services at CyberArk, that situation calls for systems of continuous verification and certification of users’ IDs throughout sessions, as users navigate across networks. This must occur in the context of zero trust, “meaning that we’re going to verify everything that you do, as you do it, instead of just giving you standing access to all the systems just because you’re an employee or a contractor within an organization.”

Making the zero trust architecture operative requires use of multi-factor authentication (MFA), Murphy said. This notion is expressed explicitly in the recent White House executive order on cybersecurity, so in a sense it is policy anyhow.

But in strengthening its approach to managing cyber identities and network privileges, an agency risks spoiling the user experience. Having users constantly re-enter credentials might be today’s paradigm, Murphy said, but it doesn’t have to be that way. With CyberArk’s platform, users can log in with their passwords and second authentication factors, then have several minutes to open applications for which they are approved.

“So we can start to suppress some of those alerts,” Murphy said. “But at the same time, we can also have it where, if there’s a specific application or a certain configuration where you want them to [multi-factor authenticate] every time, we add this additional approval. We can configure the system to do that as well.” The challenge could be time-related or geography-related, thereby barring spoofed identities.
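The configuration Murphy describes boils down to a small policy decision: reuse a recent authentication within a grace window unless the application is flagged to always challenge or the request context looks unusual. The sketch below illustrates that decision generically; it is not CyberArk’s product logic, and the window, app list and countries are invented.

```python
# Generic adaptive-MFA decision sketch (not CyberArk's product logic).
# Re-prompt only when the grace window has lapsed, the app demands it,
# or the request context (location here) looks unusual.
from datetime import datetime, timedelta

ALWAYS_CHALLENGE_APPS = {"privileged-admin-console"}   # per-app override
GRACE_WINDOW = timedelta(minutes=10)                   # suppress re-prompts inside this window

def needs_mfa(app, last_mfa_at, now, request_country, usual_country):
    if app in ALWAYS_CHALLENGE_APPS:
        return True                                    # certain apps challenge every time
    if request_country != usual_country:
        return True                                    # geography-based step-up
    return now - last_mfa_at > GRACE_WINDOW            # otherwise honor the grace window

now = datetime(2021, 7, 26, 9, 0)
print(needs_mfa("email", now - timedelta(minutes=3), now, "US", "US"))                      # False
print(needs_mfa("privileged-admin-console", now - timedelta(minutes=3), now, "US", "US"))   # True
print(needs_mfa("email", now - timedelta(minutes=3), now, "RO", "US"))                      # True
```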

“So this is where it’s not a one size fits all,” Murphy said. “We really operate in the space of keeping the security high, while also keeping the user experience high, which is very challenging to do.”

He cautions against the use of codes coming to cell phones or via e-mail, both of which can be intercepted.

“We can do this several different ways. We can do it through biometrics or we can do it through a push notification, versus a text message that can be spoofed or replicated, or an email that could be stolen. So this is where there’s different elements, or different layers of security, to the way we do MFA to these users,” Murphy said.

The detailed, technical solutions to multi-factor authentication must operate within a framework that includes an identity governance solution.

“We need to have a map with a better way to link all this together, to know when we’re going to take which actions against certain either commands that are run, tasks that are done, systems we’re going to access from, and from which users,” Murphy said.

For example, if a known identity is coming in through an external portal, that might invoke different challenge-response routines than if the same identity logs on from within the corporate network. Behavioral anomalies – someone who normally works days logging on at 2 a.m., for example – can also trigger certain responses.

“The identity governance solution should be working to identify all the account privileges you have within your agency,” Murphy said. “We should know what they should have access to and what they shouldn’t. As people move around and change roles and responsibilities, the identity should morph with that.”

He added, “What it also does is allow you to know if an additional account is created, a rogue account or something random. You can be aware that this is potentially something we need to investigate further, because that doesn’t follow our governance process.”
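Stripped to its essentials, that governance check is a comparison of observed accounts against an approved entitlement baseline. The sketch below is a toy version of that comparison; the users, systems and entitlements are invented.

```python
# Toy governance check: flag accounts that exist on a system but are not
# in the approved entitlement baseline. All names are invented.

approved_entitlements = {
    "alice": {"hr-app", "email"},
    "bob":   {"email"},
}

observed_accounts = {
    "hr-app":   {"alice"},
    "email":    {"alice", "bob"},
    "db-admin": {"svc_unknown"},   # account created outside the governance process
}

def rogue_accounts(observed, approved):
    """Yield (system, account) pairs with no matching approved entitlement."""
    for system, accounts in observed.items():
        for account in accounts:
            if system not in approved.get(account, set()):
                yield system, account

for system, account in rogue_accounts(observed_accounts, approved_entitlements):
    print(f"investigate: '{account}' on '{system}' is not in the governance baseline")
```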

The security framework also encompasses the notion of non-repudiation, Murphy said. It means “whatever is done with an account or on a system, we can tie it back to a user, and the user has no way to say that wasn’t me.” Non-repudiation is enabled with strong ID enrollments and strong authentication.

Cloud hosted identity and authentication solutions can enhance security, Murphy said.

“The cloud has a lot of benefits to it around the security, the configuration, the learning that we get from the data,” he said. The downside is that the agency may not have direct control over the network pathways to and from multiple clouds, nor have the visibility it has into its own network. Mitigating in favor of the cloud is the assurance that required patching and other upgrades will take place without requiring agency intervention.

Identity Management via Endpoints

What it comes back to is that user experience, and how do you make sure you're authenticating the user when you need to. And if you need them to interact, to approve something, you're doing it at the time that they need, and not forcing them to do it at every step.

Evaluating ID Management Solutions

We tell our customers that work with us at CyberArk, it’s like a slider: You can slide it either towards security or towards operational efficiency, or we'll say user experience. It's not that you can slide both of them up simultaneously.

Listen to the full show: 

Telemedicine with 5G could be a gamechanger for military health
https://federalnewsnetwork.com/federal-insights/2021/07/telemedicine-with-5g-could-be-a-gamechanger-for-military-health/ (July 22, 2021)

Telehealth became an even bigger industry during COVID-19. Doctors were forced to think of creative ways to see patients as people stayed home to avoid the spread of the virus.

However, as 5G is starting to roll out, telehealth may be breaking into a completely new plane. At Joint Base San Antonio (JBSA), the Air Force is testing capabilities that could be the future of medicine.

“5G brings a whole new paradigm and architecture to the table from what we’ve seen before, even up through the current 5G non-standalone that you see advertised on TV today,” Jody Little, executive program manager for 5G NextGen at JBSA, said during a Federal Insights discussion sponsored by Verizon. “Now you can bring large amounts of data forward or back to it and operate in the forward edge. You can virtualize these applications and get very ultra-low latency. And now you’re supporting lots of sensors. Whereas in, say, 4G, you could support maybe 100. Here, you can support thousands.”

That means doctors have the opportunity to monitor patients like never before. Doctors across the country can sit in on surgeries and experience them almost as if they were in person by looking at multiple sensors and using virtual reality.

“If you think about advances in gaming, and you think about 5G telemedical applications, you’ll be able to push those forward so they can operate in real time, see the data, whether that’s a virtual reality application or augmented reality application, and it’ll be in real time when interacting with the patients,” Little said. “5G promises significant improvement in the ability for the Defense Department to support critical care at the point of care. What that means is that they’ll be able to provide more care in real time and bring more expertise to the table than they currently are.”

Doctors will be able to layer models on top of patients and be guided by experts from anywhere in the world.

Putting sensitive information like medical sensor data and personal details on the network creates new challenges for security.

Little said patient security is of utmost importance and DoD is going beyond traditional security measures to protect it.

“The international body that manages the 5G standards has some very good 5G security standards,” Little said. “From a DoD perspective, we need more and we need better in some cases. Our perspectives run from the user equipment all the way back to the cloud or from the cloud all the way up to the user equipment. The whole capability from end to end needs to operate in a secure zero trust architecture.”

Little said it also takes some work on the part of users as well to ensure networks remain safe.

“What we’re going to have to see is the architecture and the interfaces support better cybersecurity,” Little said. “We can’t allow things to happen that would cause malicious activities within the 5G network or the applications that are running. It’s really difficult to change the users’ habits. But it has to start early and continue through, and we see that all the time in training.”

Army is testing multiple avenues to implement 5G on bases
https://federalnewsnetwork.com/federal-insights/2021/07/army-is-testing-multiple-avenues-to-implement-5g-on-bases/ (July 14, 2021)

The Army is embarking on a handful of 5G tests that it hopes will bring the service into the future with networking technology.

The service is partnering with industry to use 5G for telemedicine, travel and training.

“5G as a whole is nested within the overall army modernization efforts,” Doug Babb, Army G6 Senior Program Lead for 5G Experimentation and Integration, said during a Federal Insights discussion sponsored by Verizon. “Across different capability sets the Army is looking at how we invest and modernize our capabilities. This increased connectivity has potential to create greater efficiencies and greater capabilities.”

The Army is running 5G testbeds at multiple bases, along with the other military services, to try out 5G advancements and test security.

For example, at Fort Carson in Colorado, the Army is testing automated vehicles “to evaluate how automated technology can enhance mission readiness, and assess the potential to reduce base operating costs, improve safety and enhance quality of life for military service members and their families, and provide transportation services more efficiently and effectively,” according to the Army Corps of Engineers.

It’s not just new technologies that soldiers have access to with 5G. The 5G networks also helped with existing technologies, especially during COVID-19.

“Things that are critical, we’re talking about personnel records, we’re talking about medical access, telemedicine, as we’ve seen through the COVID environment have to be able to connect with a provider,” Babb said. “It’s been critical, especially when we need to work on the social distancing aspect of COVID and make sure we remain as safe as possible. Telemedicine is just, it’s just a great example of the capability we’re talking about and greater access.”

Babb said the Army is looking at 5G from a holistic point of view to offer interconnectedness for soldiers in the future.

“We see great opportunities to enhance the command posts and command post operations by tying in autonomous vehicles, both drone aircraft as well as unmanned platforms and sensors,” Babb said. “Doing this integrates autonomous and robotic platforms that set the condition to provide greater protection for our soldiers.”

Of course, as more devices are connected to a network, the attack surface grows, making the network more vulnerable.

Babb said cybersecurity is paramount for the Army when it comes to 5G.

“What that really means is that we’re making sure we secure networks, all types of networks, because with the 5G architecture, you’re looking at bringing a lot of different networks together and really enhancing and creating that security posture,” he said. “Let me use an example. Banking is one of those things that we’re all familiar with. When you log on to your bank account, the first thing they do is text you a second security password. They’re authenticating you before you get into the information that’s contained within your account. And it’s along these same lines that we’re looking at implementing these types of strategies and capabilities to enhance the security posture of our networks and our information.”

No excuses left to avoid switching to GSA’s EIS contracts and adopting software-defined networks
https://federalnewsnetwork.com/federal-insights/2021/06/no-excuses-left-to-avoid-switching-to-gsas-eis-contracts-and-adopting-software-defined-networks/ (June 29, 2021)

A couple of factors have come together to enable federal agencies to modernize their networks in a major way. One, Congress has appropriated substantial money for information technology modernization in general. And two, the General Services Administration took steps to ensure the latest networking technologies are available via its signature Enterprise Infrastructure Solutions (EIS) governmentwide acquisition vehicle.

According to Tony Bardo, the assistant vice president of government solutions at Hughes Networks, this presents agencies with what they need to not merely meet next year’s deadline of having all of their telecom inventory on EIS, but also to replace legacy technology with software-defined, wide area networks, or SD-WAN.

Coupled with a variety of transport media available, Bardo said, SD-WAN will better enable agencies to carry out cloud strategies, empower tele- and remote employees, and fine-tune their public-facing digital services.

Above all, Bardo cautioned in an interview with Federal News Network, now is decidedly not the time to do what he called “like-for-like” acquisitions. That is, replacing existing contracts with EIS task orders, but acquiring the same dated technology. Specifically, he named MPLS, or multiprotocol label switching, an aging and expensive way to get packets from point A to point B, as due for replacement.

SD-WAN technology is “absolutely perfect for the digital experience, for constituents to interact with the government online,” Bardo said, “because of the various access methods that SD-WAN enables the agency to deploy.” SD-WAN, Bardo said, complements the multiple transport mechanisms – such as cable, satellite, fiber, even DSL – in use at agency sites and in users’ homes.
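Conceptually, the SD-WAN behavior Bardo describes is policy-based path selection across whatever transports a site happens to have. The sketch below is a simplified illustration of that idea with made-up link measurements and application policies; it is not a Hughes or EIS configuration.

```python
# Simplified illustration of SD-WAN-style path selection across mixed
# transports (fiber, cable, satellite, DSL). Link metrics are invented.

links = [
    {"transport": "fiber",     "latency_ms": 12,  "loss_pct": 0.1, "up": True},
    {"transport": "cable",     "latency_ms": 30,  "loss_pct": 0.5, "up": True},
    {"transport": "satellite", "latency_ms": 600, "loss_pct": 0.3, "up": True},
    {"transport": "dsl",       "latency_ms": 45,  "loss_pct": 2.0, "up": False},
]

POLICY = {  # per-application requirements an overlay controller might enforce
    "voip":        {"max_latency_ms": 150,  "max_loss_pct": 1.0},
    "bulk_backup": {"max_latency_ms": 2000, "max_loss_pct": 3.0},
}

def pick_path(app, links):
    """Choose the lowest-latency healthy link that meets the app's policy."""
    policy = POLICY[app]
    candidates = [l for l in links if l["up"]
                  and l["latency_ms"] <= policy["max_latency_ms"]
                  and l["loss_pct"] <= policy["max_loss_pct"]]
    return min(candidates, key=lambda l: l["latency_ms"])["transport"] if candidates else None

print("voip ->", pick_path("voip", links))                # fiber
print("bulk_backup ->", pick_path("bulk_backup", links))  # fiber here too; satellite also qualifies
```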

Bardo urged agencies to release their fair opportunity solicitations, the first step in transitioning to the EIS contractors. And he said early EIS-adopting agencies that continued with MPLS should pause and see if they’re able to switch to newer technology.

“Too many fair opportunities haven’t come out the door yet, that need to, to make these deadlines that GSA has imposed,” Bardo said. He added, “My concern is that the early adopter agencies might have made those awards on a like-for-like basis. Are they still going down the MPLS path? Have they put the brakes on and said, ‘Okay, let’s, let’s have discussions with the awarded vendor?’”

Current Standing of Network Modernization in Government

The shared technologies are really where you've seen the cost drivers improve so much. It's not so much that any given agency is going to spend less … but they're going to get so much more power, so much more bandwidth, so much more path diversity…and ability to build up the network.

Remote Work and Network Modernization

Agencies need to choose an EIS service provider who's not only comfortable working in the corporate environment, but also the consumer environment, because, frankly, those two worlds have merged now. All of a sudden, your location inventory, if you will, of sites has doubled or tripled, or even more.

Listen to the full show:

AI success means starting by asking, What is the problem?
https://federalnewsnetwork.com/federal-insights/2021/06/ai-success-means-starting-by-asking-what-is-the-problem/ (June 2, 2021)

Artificial intelligence is a powerful technology, but it’s not a magic salve you can apply to a process to make it better. If anything, AI – and the related machine learning, robotic process automation and even data analytics technologies – requires more attention than ever to an eternal basic of information technology deployment.

Specifically, success in AI and the other data-driven technologies starts with a clear and defined notion of the expected outcomes. That means the business or process owners are those most able to initiate successful AI projects because they most intimately understand the business challenges they face and the mission outcomes they must deliver.

“It’s the people who are living the problem and understand what’s going on day to day,” said Jim Smid, the chief technology officer of Iron Bow Technologies. “The IT manager, the CIO – they don’t understand what, really, those problems are. And so they don’t really understand what the outcomes need to be.”

Therefore, Smid added, the choice of specific tools and technologies should come closer to the end of a project design than it typically does. Problem definition and selection of data sources that will enable algorithms to produce expected outcomes must come earlier.

In fact, choosing a tool too soon can limit the power of a project. For example, choosing to automate a time-intensive process with an RPA tool might mean you overlook the opportunity for a more comprehensive solution involving APIs and programming. AI can come from a variety of tools and solutions.

“RPA might be a good thing for prototyping things,” Smid said. “That also might be a good thing long term. You don’t really know until you understand a little bit more about the data, and the process and what’s all going to be involved.”

Smid cited one client. “We do a lot with the VA. We’re not talking to the IT folks, we’re talking to the clinicians about what a day in their life looks like.”

Having a view of the problem from that standpoint can help the IT people better understand the totality of technologies that might be brought to bear on a problem. For example, the clinical need for reliable and easy-to-use remote sessions raises not only automation questions related to gathering data about patients and building trend analyses, but also questions about technologies such as 5G to reduce latency or the need for more intuitive user portal design. Perhaps better image processing with improved cameras can produce more and higher-quality data for applying AI to pathologies with visible symptoms.

Smid cautions against the common mistake of thinking too grandiosely. In the defense domain, for example, often what operators really want is solutions to day-to-day problems that sap people’s time and attention, rather than an algorithm to bring world peace.

For example, the predictive analytical approach to scheduled parts replacement that has taken hold in the industrial sector can adapt to military logistics and sustainment. Smid cited the now common practice of replacing disk drives before they fail.

“That was game changing,” Smid said. “To be able to do that with your helicopters, your tanks, your jeeps – all that predictive maintenance is a critical piece. It’s where a lot of resources are going to get applied now, because there’s such a big impact.”
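The disk-drive example generalizes to a simple rule: act when a component’s estimated failure risk crosses a threshold before the next planned touch point. The sketch below is a toy version of that rule; the assets, components and risk scores are fabricated, not a fielded logistics model.

```python
# Toy predictive-maintenance rule: schedule replacement when a component's
# estimated failure risk before its next service window crosses a threshold.
# Asset names, components and risk scores are fabricated.

fleet = [
    {"asset": "helicopter-07", "component": "gearbox",    "risk_next_90_days": 0.42},
    {"asset": "truck-233",     "component": "alternator", "risk_next_90_days": 0.08},
    {"asset": "tank-112",      "component": "track pads", "risk_next_90_days": 0.19},
]

REPLACE_THRESHOLD = 0.25   # replace ahead of failure, like swapping disk drives early

def maintenance_actions(fleet, threshold=REPLACE_THRESHOLD):
    return [
        f'replace {c["component"]} on {c["asset"]} (risk {c["risk_next_90_days"]:.0%})'
        for c in fleet
        if c["risk_next_90_days"] >= threshold
    ]

for action in maintenance_actions(fleet):
    print(action)
```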

Smid also advises agencies to think about the computing architectures best suited for the AI and related applications they do deploy. He said the trend for internet-of-things data pushes processing – the running of the AI applications – to take place at the edge, where the data is gathered. Often that’s where users actually need outcomes, and it saves the costs and complexities of shipping data to and from data centers or commercial clouds.

Definitions of Artificial Intelligence and Machine Learning

Most IoT projects we talk to people about, it's not about the tools themselves. I can start with a tool but not everything is a nail. So I don't always need a hammer. It's important to really look at the business owner, or the mission owner.

Automation Use Cases

The reason for pushing compute out to the edges? Moving data can be expensive. I don't want to move all the data all the time; I want to move the data that's important. Or maybe I just want to move my outcomes to a centralized location. So managing that data, and making sure you're crunching it in the right places, are really important.

Listen to the full show:

How to best leverage your hybrid computing environment for efficiency and effectiveness in AI
https://federalnewsnetwork.com/federal-insights/2021/05/how-to-best-leverage-your-hybrid-computing-environment-for-efficiency-and-effectiveness-in-ai/ (May 27, 2021)

Managing artificial intelligence and machine learning application projects is in large measure a matter of fine-tuning the locations where data and applications reside. It’s less a matter of data center versus cloud than of portability among data center, cloud and edge computing, consistent with optimal availability and cost control. Therefore, it’s important for agencies to spend some effort planning the infrastructures for systems hosting AI development and training data, as well as for deployable AI applications.

In the cloud era, this management requirement extends to the commercial clouds agencies employ as components in their hybrid computing environments. With contemporary approaches to storage tiers, application hosting decisions, and locating and updating of the agency’s own data centers, the IT staff can find efficiencies that enable AI development in a cost effective way.

For some of the latest thinking, Federal News Network spoke to Nic Perez, the chief technology officer for cloud practice at ThunderCat Technology; and Al Ford, the artificial intelligence alliances manager at Dell Technologies.

Perez said that AI application development that used to require agency-operated high performance computing environments and the associated software tooling is all finding its way into commercial clouds.

“One of the benefits of the cloud is that agencies can leverage the compute and the power that is available inside these cloud providers,” Perez said. “Move your data, and then absolutely maximize the compute power there.”

Different clouds offer differing toolsets, he added, giving agency practitioners flexibility in the degree of automation they want in staging and training AI applications. Perez said that over the last year or so, he’s seen a “land rush” of agencies moving text analytics, speech, and video data to the cloud, and performing AI on the data there.

In other instances, Ford said, it may make sense to train and deploy artificial intelligence applications neither in the cloud nor in a data center, but rather in an edge computing environment.

For example, “it could be that you’re part of the geological survey, and you’ve got a vehicle carrying a camera, and you need to access that vehicle. So the edge literally could be out, field-deployed,” Ford said. Trucks, aircraft, Humvees – all can serve as edge computing locations. He said hyper-converged and containerized workloads are easily movable among computing resources located where you gather data. In such cases, Ford said, the agency may find it advantageous to add software stacks to the cloud, from which they can communicate to the edge, where artificial intelligence is occurring.

Otherwise, Ford said, applications running large data sets in edge resources often benefit from adding local GPU accelerators. These enhance performance, while helping the agency avoid some of the costs associated with moving large data volumes and workloads in and out of commercial clouds.

Agencies may find that with this approach, they may only need to transfer across their networks the output of an application, the decision-making result of AI. The data, application, and compute stays local.

Still another option is having a vendor own and operate a replication of an agency’s data center, but in a secure, “caged” facility maintained by the vendor. Advantages include having geographically strategic compute power without the need for a data center-sized capital investment.

“You’re using the same equipment, the same technology, the same education and investment you’ve had for a number of years,” Perez said. “You’re just now moving into the next stage and being able to do it faster and quicker.” And on a predictable consumption-cost model.

Perez and Ford said it’s important to distinguish between the training period of AI and the deployment, in terms of the best location and architecture. Each may require a different computing set-up for maximum efficiency. Training is generally more efficient in the cloud, whereas deployment often requires a federated approach.

“Effectively, federated is, rather than bringing the data from the edge back to you, why not send those analytics in that virtualized container to the edge so that you’re not moving the data,” Ford said. “And then, once you have the results computed at the edge, you only send back the results.”
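Ford’s federated pattern can be summarized as: run the analytic next to the data and send back only the compact result. The sketch below illustrates that flow with invented sensor readings; in practice the analytic would ship to the edge as a container rather than a local function call.

```python
# Minimal illustration of the federated pattern: compute at the edge, return
# only a compact result instead of the raw data. Readings are invented.

edge_sites = {
    "vehicle-cam-01": [0.2, 0.4, 0.9, 0.1],   # e.g., per-frame detection confidences
    "vehicle-cam-02": [0.7, 0.8, 0.95],
}

def run_at_edge(readings, threshold=0.5):
    """Stand-in for the containerized analytic shipped to the edge node."""
    detections = sum(1 for r in readings if r >= threshold)
    return {"frames": len(readings), "detections": detections}

# Only the small summaries traverse the network back to the center, not the raw frames.
central_results = {site: run_at_edge(data) for site, data in edge_sites.items()}
print(central_results)
```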

An Agency Approach to AI and Infrastructure

Data Center Extraction

Rather than bringing the data from the edge back to you, why not send those analytics in virtualized containers to the edge so that you're not moving the data. And then once you have the results computed at the edge, you only send back the results. You're lowering the bandwidth, decreasing the amount of data that has to traverse all of those networking hops.

Listen to the full show:

How agencies can benefit from good hackers
https://federalnewsnetwork.com/federal-insights/2021/05/how-agencies-can-benefit-from-good-hackers/ (May 5, 2021)

The Defense Department launched the first Hack the Pentagon in spring 2016, five years ago.

The success of this crowdsourced approach to cybersecurity led to DoD establishing its first vulnerability disclosure policy, which created a safe, secure and legal avenue for private citizens worldwide to report vulnerabilities found on public-facing DoD websites and applications.

It also serves as a bridge between the DoD and security researcher community to work openly and in good faith together to identify and disclose vulnerabilities. DoD has held 14 public and 10 private bug bounty programs and paid out hundreds of thousands of dollars in response to private sector experts finding problems.

These efforts have led to the expansion of this good guy hacker approach to cybersecurity across the military services and into the federal civilian world.

From bug bounty programs to vulnerability disclosure programs, or VDPs, agencies are seeing the value of these approaches to securing networks and applications.

Alex Rice, the co-founder and chief technology officer at HackerOne, said over the last five years, DoD has identified more than 10,000 vulnerabilities through bug bounty and other similar programs.

“A large percentage of them are from the Defense industrial base,” Rice said during the Federal Insights discussion Understanding the Versatility of Crowdsourced Cybersecurity, sponsored by HackerOne. “If you’re asking the public to assess your attack surface, it’s unheard of these days that it doesn’t involve some element of the supply chain. So we try to structure these programs so that the good hackers behave like the bad guys. Just a little over a month ago, the Department of Defense expanded their vulnerability disclosure policy to cover the DIB as well. So now they’ve explicitly set up a structure where anyone in the DIB can opt in to the DoD’s overall vulnerability disclosure policy. Within the first week, there were over 50 participants and a few dozen vulnerabilities identified in the DoD’s supply chain. So it goes to show that this approach works across the most diverse attack surfaces out there.”

Rice said the VDP program for defense contractors not only offers a path to find problems, but also gives them a safe way to report those issues.
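
In practice, a safe way to report comes down to a well-defined submission sent through the sanctioned channel. The structure below is a hypothetical sketch of the fields such a disclosure typically carries; it is not HackerOne’s or DoD’s actual schema.

# Hypothetical shape of a vulnerability disclosure submission.
# Field names are illustrative only, not any program's real schema.
disclosure_report = {
    "asset": "https://example.mil/login",        # in-scope, public-facing asset
    "weakness": "CWE-79: Cross-site Scripting",  # standard weakness classification
    "severity": "medium",                         # reporter's initial assessment
    "steps_to_reproduce": [
        "Browse to the search page",
        "Submit a crafted query containing a script tag",
        "Observe the script executing in the response",
    ],
    "impact": "Session tokens of other users could be captured.",
    "reporter": "researcher@example.com",
}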

The VDP, bug bounty and other similar approaches aren’t just limited to big networks or systems. Rice said they can be used for everything from cloud instances to DevSecOps development to public facing websites.

“Rather than filling up good hackers’ time with the usual fluffy reports that we’re used to in those types of engagements, they’re paid if they actually are able to demonstrate impact. What that means is we ended up recruiting a very diverse and very talented group of folks to perform in these types of engagements. We’re able to get specialists in one particular piece of it who might not be able to participate in running a classic assessment against it,” he said. “You can have very refined, best-of-the-best talent engaged in a pay-for-performance model that just complements the existing structure you’ve been doing, but always delivers something that those traditional approaches missed.”

The benefits of using outside experts are many, including finding blind spots in your network and taking advantage of a broader knowledge base that comes from relying on these experts.

“If you’re at the stage where you’re deliberately thinking about how do I augment my cybersecurity program with hackers and rewarding them for it, there’s a few things to keep in mind as you go about it,” Rice said. “First things first, hackers are going to find things. Nobody runs these programs and doesn’t learn something they didn’t know beforehand. You want to make sure you’re in a position to be able to action those findings, typically into a vulnerability management program. But you want to think about what are all of the things that we’re asking hackers to look for, and what are we going to do about it once they find it. That means making very close allies with your vulnerability management and incident response practices before you kick this off.”
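
In code terms, actioning those findings is a handoff from the intake program into the vulnerability management queue. The sketch below is a generic illustration using made-up interfaces for the ticketing and paging systems; it is not an actual HackerOne integration.

# Generic sketch of routing a hacker-submitted finding into vulnerability management.
# The ticket_system and pager objects are hypothetical placeholders for whatever
# tools an agency already runs.

SEVERITY_SLA_DAYS = {"critical": 7, "high": 30, "medium": 90, "low": 180}

def triage_finding(finding, ticket_system, pager):
    """Open a remediation ticket for a finding and escalate the severe ones."""
    severity = finding.get("severity", "low")
    ticket = ticket_system.create(
        title=f"[VDP] {finding['weakness']} on {finding['asset']}",
        body=finding["impact"],
        due_in_days=SEVERITY_SLA_DAYS.get(severity, 180),
    )
    # Critical findings also go straight to incident response, the other
    # practice Rice says should be a close ally before a program launches.
    if severity == "critical":
        pager.notify(team="incident-response", ticket=ticket)
    return ticket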

Rice said there are two basic types of programs. One is a point-in-time program, similar to a penetration test or security assessment.

The second type is for more mature organizations that are looking for more feedback.

“The most powerful aspect of these types of programs is the ability to evolve them into a continuous security testing program. These are the type of bounty programs that you see folks like Google and Facebook pioneering; Microsoft and Amazon are huge proponents of them as well at this point. You’re starting to see more traditional enterprises embrace them as well, like Goldman Sachs and General Motors,” he said. “They’re meant to find and explore new attack surfaces fast and continually give security feedback into your security programs.”

Rice added that incorporating the continuous feedback loop, and responding to it, will create stronger, more resilient networks and systems.

Good Hackers

It's a diverse community of folks from various backgrounds who participate in finding security flaws in software in an incentivized manner. For example, we ran one of the early bounty programs, Hack the Air Force, and we had about 130 participants in it. The top participant from that program was a high school sophomore who, at that time, was perfectly capable of hacking the Air Force, as he demonstrated, but normally wouldn't be given that opportunity in a traditional vetted environment.

Incorporating Hackers into a Security Program

First things first, hackers are going to find things. Nobody runs these programs and doesn't learn something they didn't know beforehand. You want to make sure you're in a position to be able to action those findings, typically into a vulnerability management program. But you want to think about what are all of the things that we're asking hackers to look for, and what are we going to do about it once they find it. That means making very close allies with your vulnerability management and incident response practices before you kick this off.

Listen to the full show:

Digital modernization starts with the mission, not the digits https://federalnewsnetwork.com/federal-insights/2021/04/digital-modernization-starts-with-the-mission-not-the-digits/ https://federalnewsnetwork.com/federal-insights/2021/04/digital-modernization-starts-with-the-mission-not-the-digits/#respond Mon, 05 Apr 2021 18:02:33 +0000 https://federalnewsnetwork.com/?p=3402163 Technology refresh, business process improvement, and improving services delivered to customers and constituencies – they’re all inputs to the ongoing modernization efforts at federal agencies. Agencies caught in the technical debt of outdated technologies are finding the soundest approach to modernization is not simply to upgrade technology for technology’s sake, but rather to approach modernization from the outside in. It means looking at the desired service outcome, then reworking the processes behind the scenes to enable the service, then choosing the best technology to support the process.

Gwen Cadieux, the director of Enterprise IT and Managed Services at GDIT, is a strong proponent of that approach, having seen it succeed across a variety of agencies.

The outside-in approach also lets agencies manage another crucial element, namely the budget.

“I think one of the biggest issues is the budgets, not only the size of the budgets, but the budget cycles, and being able to fit the amount of modernization that takes place, or needs to take place with some of our enterprises, and within those budgets,” Cadieux said.

She said that by taking a technology-first approach to modernization, agencies often risk siphoning off dollars unnecessarily by acquiring redundant products. Cadieux cited one agency that had purchased seven network management tools, each for a different segment of the network, each needing its own maintenance and upgrade costs. She cautioned against the “shiny object syndrome” when presented with new products. Better to ask, “Okay, that’s great, but what problem are you trying to solve with that? Do you really need that right now?”

Another risk of adding technology is that it can interfere with technical dependencies within or among complex systems. Cadieux said an integrator like GDIT, with its constellation of best-of-class technology partners, can help an agency strategize technology modernization, taking into account dependencies and refresh cycles. Ideally, this happens within the context of the mission delivery plans that should be driving the product acquisitions.

“We have tools that help us identify those dependencies and keep track of those dependencies as the baseline continues to change,” Cadieux said. For example, she added, “You don’t just start ripping out routers and putting in software defined network routers. You have to plan that out. You have to understand what the implications are across your enterprise.”
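
The dependency tracking Cadieux describes amounts to a graph walk: before swapping out a component, list everything downstream of it. The snippet below is a simplified sketch with made-up component names; it is not GDIT's actual tooling.

# Simplified dependency check: what is affected if we replace a given component?
# The graph below is a made-up example of an enterprise baseline.

dependencies = {
    "core-router-1": ["branch-switch-a", "branch-switch-b", "vpn-gateway"],
    "vpn-gateway": ["remote-access-portal"],
    "branch-switch-a": ["voip-cluster"],
}

def impacted_by(component, graph):
    """Return every downstream system that depends on the given component."""
    impacted, stack = set(), [component]
    while stack:
        for child in graph.get(stack.pop(), []):
            if child not in impacted:
                impacted.add(child)
                stack.append(child)
    return impacted

# Planning the software-defined router swap Cadieux mentions:
print(sorted(impacted_by("core-router-1", dependencies)))
# ['branch-switch-a', 'branch-switch-b', 'remote-access-portal', 'voip-cluster', 'vpn-gateway']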

Cadieux said agencies are finding success in the incremental but continuous approach to digital modernization. That means determining a single mission or service delivery requirement, then designing the technical means to achieve it. Surprisingly, she said, this can deliver more improvement faster than the outdated grand design, big bang or waterfall development approaches of the past.

“The true driver is the speed of change, and how quickly you need to get new capabilities, new technologies into an architecture,” Cadieux said. “It’s an iterative process. You don’t really have the time to wait until you get all the way to the end.” In this manner, customers can see progress while offering feedback on the next round of capabilities needed. In other words, use the DevOps or DevSecOps model.

“I think most of our customers are comfortable or familiar enough with the DevSecOps process that it’s not as hard a sell as maybe five or 10 years ago,” Cadieux said.

Cadieux said emerging technologies such as 5G telecom, robotic process automation, and artificial intelligence will transform mission delivery and bring all sorts of new efficiencies to agencies. But agency teams must first establish the mission need, she said.

Important to establishing the mission need, she said, is involving all of the parties with a stake in modernization. That includes the users themselves.

“If you don’t talk to them,” Cadieux said, “you’re missing a huge piece of your requirements and information to help inform better decisions about what kind of technologies you may or may not need.”

Approach to Enterprise Modernization

Start [modernization] with the need, or what the problem is that your customer is trying to address. Start there, don't start with the solution. Oftentimes, we get what I call the shiny object syndrome; something new hits the market. Okay, that's great, but what problem are you trying to solve with that? Do you really need that right now?

Continuous Development

The people who will tell you the honest truth about what's working, what's not working, or where they feel [technology] can better support them are your end users. Those are the folks who know more about it than anybody else, even if you're the CIO.

Listen to the full show:
