FEMA’s enterprise cloud services potentially could lower costs by 30%-to-40%
Federal News Network, Ask the CIO, July 5, 2022

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

The Federal Emergency Management Agency is expanding its footprint in the cloud. And it is doing so in a somewhat unusual way.

FEMA is partnering with the Agriculture Department and developing a chargeback model for its mission areas.

Jim Rodd, FEMA’s cloud portfolio manager, said as part of modernizing the National Flood Insurance Program, the agency and USDA are using the Google Cloud platform.

“They’re actually doing it in conjunction with USDA. NFIP is bringing it up in a methodology that will allow us to absorb it into the FEMA enterprise cloud with no issue. It’s all our standards and everything,” Rodd said at the recent ACT-IAC Emerging Technology and Innovation Conference.

Rodd said the reason FEMA looked to partner with USDA is twofold. First, the two agencies already partner to help citizens impacted by floods. Second, and maybe most important to the discussion around cloud, Rodd found USDA to be among the most mature organizations in applying the chargeback model for enterprise cloud services.

“When I first took the position over, I wanted to speak to some other cloud brokers that were in the federal government, and three that popped up was two at DHS, which were U.S. Citizenship and Immigration Service and the Customs and Border Protection directorates. I’ve talked to them, but neither one of them have a multi cloud solution with a chargeback methodology. So we wanted to make sure we were speaking to somebody in that realm as well. And USDA was pretty much the big dog on the block,” Rodd said. “They had a very mature cloud doing chargeback and it was multi cloud, so it only made sense to go and talk to them.”

Buying cloud services in a new way

Agencies have struggled with the idea of a chargeback model for enterprise services for more than 50 years. Federal shared services for financial management and human resources have been out in front of this effort, but the agencies providing these services have, for the most part, struggled to make their case to large agencies.

The General Services Administration’s Cloud Information Center highlights several acquisition challenges, including advanced metering services from vendors and governance focused on who holds the responsibility of assessing cloud utilization reports for chargeback incentive purposes.

The Office of Management and Budget and the Federal CIO Council have been pushing agencies to implement the Technology Business Management (TBM) framework to measure the cost and value of IT services, not necessarily just cloud services. Agencies had to fully implement TBM cost towers as part of their 2023 budget requests that went to OMB earlier this year. But challenges around data quality and quantity have slowed down this effort over the last five years.

But understanding the costs in a multi-cloud environment is why FEMA is pushing forward with the chargeback model.

Rodd said with FEMA already using Amazon Web Services and Microsoft Azure cloud instances, and now adding Google Cloud, the agency wanted to ensure it knew where and how much it was spending on these services. Former FEMA CIO Lytwaive Hutchinson said earlier this year that the agency’s goal is to have at least 50% of all of its cloud-ready systems and services moved into the cloud by the end of 2022.

“The thing with the chargeback model is being able to offset cost. That’s the name of the game,” he said. “All sudden our current cloud footprint is probably about $2 million-to-$3 million a year. If we can offset some of that, rather than what is happening right now where we’re carrying all of it, as we ingest more clients and more services, we should start to see an offset in costs.”

Big savings over time

Rodd said FEMA mission areas that have turned off on-premises or legacy technology are seeing cost reductions of 30% to 40%.

“It’s giving our internal and external OCIO clients the opportunity to really be able to plan efficiently by having all of that in one place,” he said. “There’s obviously a massive cultural shift with moving to the cloud and FEMA is just as aware of that need for a culture shift as anybody else. We try to sell it on the scalability and flexibility, the ability to convert our redundant possibilities East Coast, West Coast, north, south, across this CSP, that CSP. We try to show all that, but they don’t really see it because that’s the back end. One of the things we like to do when we are briefing to a prospective client who has no knowledge of the cloud, I don’t make any promises on price because here’s the reality: in any government agency, for that first year or two, you’re running hybrid. You have to maintain that physical environment, especially for somebody with a mission like ours, where we have to be up no matter what. During that time, obviously, your costs are going to be substantially higher. So I actually stay away from that, or I brutally tell them look, this first year or two, it’s actually going to be more expensive. But as soon as we can start turning off your stuff in the physical environment, and shutting that stuff down and killing those contracts, that’s when you’re going to start to see your costs go down.”
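Rodd’s pricing caution, that hybrid years cost more before decommissioning unlocks the savings, can be illustrated with a toy cost model. All figures here are invented for illustration, not FEMA’s actual numbers:

```python
# Toy illustration of the hybrid-period cost curve Rodd describes: during the
# first year or two an agency pays for BOTH the legacy data center and the
# cloud; the 30-40% reduction only appears once legacy contracts are killed.

LEGACY_ANNUAL = 1_000_000   # hypothetical on-premises run cost per year
CLOUD_ANNUAL = 650_000      # hypothetical cloud run cost (roughly 35% lower)
HYBRID_YEARS = 2            # years where both environments run in parallel

def annual_cost(year: int) -> int:
    """Total spend in a given migration year (year 1 = first hybrid year)."""
    if year <= HYBRID_YEARS:
        return LEGACY_ANNUAL + CLOUD_ANNUAL  # paying for both environments
    return CLOUD_ANNUAL                      # legacy decommissioned

# Years 1-2 cost more than legacy alone; years 3+ show the reduction.
for year in range(1, 6):
    print(year, annual_cost(year))
```

Under these assumed figures, the first two years run 65% over the legacy baseline before settling 35% below it, which is exactly why Rodd says he makes no price promises for the hybrid period.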

Rodd added that, in a perfect world, he would like his cloud broker office to break even between the cost of providing the enterprise services and the funding received from mission users.

“I don’t really ever think we’re going to get there, but even if we got to 50%, that’d be outstanding,” he said. “We developed a cost model. What we wanted is a one-stop shop so if a client comes to us and tells us their need, or we help them to develop a solution, we didn’t want them to then have to talk to the sustainment folks and get a price and then talk to the license folks and get a price. We tried to make our cost model as inclusive as possible. It covers everything from your basic compute needs, your migration, your authority to operate and your licensing. We’re actually adding cyber to it right now.”
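The “one-stop shop” cost model Rodd describes can be sketched as a single inclusive quote covering compute, migration, authority to operate (ATO) and licensing, rather than separate conversations with the sustainment and licensing offices. The rate categories come from his description; the rates themselves are purely hypothetical:

```python
# Hypothetical sketch of an all-inclusive chargeback quote in the spirit of
# FEMA's cost model. Every rate below is invented for illustration.

RATES = {
    "compute": 0.12,       # $ per vCPU-hour, illustrative
    "migration": 5_000.0,  # flat fee per migrated application, illustrative
    "ato": 15_000.0,       # one-time authority-to-operate support, illustrative
    "license_seat": 30.0,  # $ per user seat per month, illustrative
}

def quarterly_chargeback(vcpu_hours: float, apps_migrated: int,
                         new_atos: int, seats: int, months: int = 3) -> float:
    """One inclusive quote for a mission-area client for a quarter, so the
    client never has to price compute, migration, ATO and licenses separately."""
    return (vcpu_hours * RATES["compute"]
            + apps_migrated * RATES["migration"]
            + new_atos * RATES["ato"]
            + seats * RATES["license_seat"] * months)

# Example client: 50k vCPU-hours, 2 app migrations, 1 ATO, 200 licensed seats.
print(quarterly_chargeback(vcpu_hours=50_000, apps_migrated=2,
                           new_atos=1, seats=200))
```

The design point is the single function: adding the cyber line item Rodd mentions would be one more rate and one more term, not another office for the client to visit.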

Rodd said FEMA wanted to get a third-party expert to confirm its chargeback model would work, and received solid reviews from Gartner. He called it “elegant.”

USPTO putting foundational piece of zero trust architecture in place
Federal News Network, Cybersecurity, June 29, 2022

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

The U.S. Patent and Trademark Office is taking a huge step to reduce the cyber risks from its employees.

Time and again, cybersecurity research finds the employee is the weakest cyber link. The fiscal 2020 Federal Information Security Management Act (FISMA) report to Congress said two of the top three risk and vulnerability assessment findings were directly related to employees: spear phishing weaknesses and easily crackable passwords. The Office of Management and Budget hasn’t released the 2021 FISMA report to Congress, which typically comes out at the end of May.

To that end, USPTO will be among the first agencies to implement a Secure Access Service Edge (SASE) architecture.

Jamie Holcombe is the chief information officer (CIO) at the United States Patent and Trademark Office (USPTO). (Photo by Jay Premack/USPTO)

Jamie Holcombe, the chief information officer for USPTO, said SASE will accelerate USPTO’s journey to zero trust.

“I think it’s the first foundational piece of the zero trust architecture that we get to actually act upon. So with the executive order, and zero trust architecture, the fact is that it’s not one product, it’s more of a philosophy. I like SASE as that architectural philosophy to ensure that we can identify users and devices, and apply the policy-based security controls, delivering that secure access to the applications and ensuring that our data is secure,” Holcombe said in an interview with Federal News Network. “The fact that SASE addresses the architecture and that philosophy around that scope is providing us the first time that we can really concentrate on that architecture and the ability to actually go into it and use products, not just one product, but products in that philosophy for ensuring SASE and zero trust.”

SASE, one of the latest cyber buzzwords, attempts to converge multiple security technologies for web, cloud, data and threat protection into a platform that protects users, data and applications both in the cloud and on premises.

The move toward a SASE model will help eliminate perimeter-based tools and give security operators a “single pane of glass” from which to ensure the safety of users, data and devices, and to apply a consistent security policy.

USPTO awarded Netskope a contract that could be worth $4 million and last as long as 19 months to implement the SASE architecture.

Holcombe said by implementing the SASE architecture, USPTO will drive security to the edge instead of just the network.

“What we’re talking about is the identification of users, the identification of devices and all the things in between the OSI layers [where computers communicate with each other] to put them all together in a secure way,” he said. “Netskope’s product actually provides the ability for that architecture. But there’s a lot of other things that you need to plug and play in order to be that secure. So that’s what the edge means to me going out and securing not just one part but all the parts in an architecture.”

Risk scores driving decisions

Beau Hutto, the vice president of federal at Netskope, said this approach lets agencies apply what they know about users, devices and other factors like location to create a risk profile, and then apply it in a “least privilege” way.

“The user should have a risk score. The actual device should have a risk score. The data has a sensitivity score. So being able to bring a very basic layer all of that together and what access you give to that data because really the crown jewels is the data, it’s no longer the network,” Hutto said. “When you go to protect that data, you have to understand the context in which everything’s being accessed. That is truly where least privilege zero trust architectures come into play in a significant way.”
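Hutto’s description, a user risk score, a device risk score and a data sensitivity score combined into an access decision, can be sketched as a small decision function. The scoring scales and threshold rule below are invented for illustration; they are not Netskope’s or USPTO’s actual policy:

```python
# Minimal sketch of the least-privilege decision Hutto describes: combine user
# and device risk with the sensitivity of the requested data before granting
# access. Scales (0.0-1.0) and the threshold rule are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Context:
    user_risk: float         # 0.0 (trusted) .. 1.0 (high risk)
    device_risk: float       # 0.0 (managed, patched) .. 1.0 (unknown device)
    data_sensitivity: float  # 0.0 (public) .. 1.0 (crown jewels)

def allow_access(ctx: Context) -> bool:
    """Grant access only when the worst risk score stays below a
    sensitivity-scaled tolerance: the more sensitive the data,
    the less risk is tolerated."""
    combined_risk = max(ctx.user_risk, ctx.device_risk)
    tolerance = 1.0 - ctx.data_sensitivity
    return combined_risk <= tolerance

print(allow_access(Context(0.2, 0.1, 0.5)))  # low risk, moderate data
print(allow_access(Context(0.2, 0.6, 0.7)))  # risky device, sensitive data
```

Taking the maximum of the two risk scores reflects Hutto’s point that it is just as important for the device to be trustworthy as the user: either one being risky is enough to deny access to sensitive data.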

Through SASE, USPTO is putting the employee and data at the center of the security effort. Holcombe said if the agency can reduce the chances of a user clicking a malicious link or giving up their network credentials, then its cyber posture will greatly improve.

“What I like about SASE is the fact that the machine-device control plane is in the realm of the user. I’m just doing a service and I don’t care what server it sits on. But when I create that cyber secure session, what I can do that is ensure that machine-device control plane actually has the right risk profile and it’s a two-way scoring. It’s just as important for the user to be secure as the device is to be secure, and everything in between the application, the data and the network,” he said. “What I’m really trying to do is pull that scope that surface area of the user and bring it down into the technical, such that the user doesn’t have to care and that it’s more of a machine-device control plane. That’s the way we get our security done.”

Hutto added that creating that platform, or single pane of glass, breaks down the silos that have built up around security over the last few decades.

Accelerating the move to the cloud

Through SASE, USPTO, or any agency for that matter, will capture and analyze cyber information in a more standardized, scalable and agile way.

“We’ve had the opportunity to re-imagine how our security stack can look, should it be a security stack in the cloud or as-a-service? Where the first hit that your user makes is to the service and whether they go on-premise or back out to the cloud, it’s just in a very elegant, easy, very performant solution,” Hutto said.

Holcombe said it will take some time before USPTO fully implements a SASE architecture. He said he will start with the applications already in the cloud, about 17% of all applications the agency runs.

“We are staging for about the next 17% to 20%. So we’ll have around 35% to 40% of our applications in the cloud before the end of the year. That’s from almost 3% to 4% two years ago,” he said. “Some of the applications are not there. The ones that are going to be there are in the next 20% to 30%, we’re actually refactoring them with our product design teams. We’re actually including cybersecurity and testing, and doing the continuous integration and continuous deployment in these new applications. But there’s about 30% of our applications that will never go out in the cloud. They are just too old.”

Holcombe said the more USPTO puts applications and workloads in the cloud and uses DevSecOps to continually modernize them, the more it can take advantage of SASE.

“One of my design philosophies besides pushing security to the edge is also the fact that I will not deploy something until I know I can rip it out in three years. I want to replace any tool that I put in, because that is the speed in which these tools are being rejuvenated, and there’s better tools in three years,” he said. “If you design something that lasts anywhere from 5 to 10 years, you’re wrong. Design it to do what you needed to do in three years, and then look to other things to replace it. The return on investment needs to be within three years or don’t do it.”

USDA has been trying to consolidate 17 networks for a decade, now it has the money to do it
Federal News Network, IT Modernization, June 22, 2022

Gary Washington is trying to do something at least four other Agriculture Department chief information officers have promised to do, only to come up well short.

The difference this time as USDA tries, once again, to consolidate 17 disparate networks into one is Washington has real funding.

USDA will receive $64 million from the Technology Modernization Fund Board for this project. This was USDA’s fifth award under TMF.

The TMF Board made two other awards on June 21 as well, giving the Homeland Security Department $26.9 million to modernize its Homeland Security Information Network (HSIN) and $3.9 million to the Federal Trade Commission to procure a security operations center-as-a-service (SOCaaS) in order to implement a zero trust architecture.

The board has made 14 awards since receiving $1 billion from the American Rescue Plan Act in 2021. It still has more than $650 million left in the account. The Office of Management and Budget and the General Services Administration on June 16 committed to spending $100 million on customer experience projects that cut wait times for public-facing federal services, as well as excessive paperwork and other barriers.

While DHS signaled its desire to update HSIN in an April request for information and the FTC’s move to zero trust is part of the Biden administration’s overall push to improve cybersecurity, the USDA award is what Congress created the TMF for in the first place — to give agencies a boost to get over the modernization hump.

“This investment will be used for the USDANet startup costs that reduces the number of USDA-owned and operated networks from 17 to 1 and will result in $734 million in estimated costs savings/avoidance,” the TMF website states. “The lowered total cost of ownership by the USDA means Mission Areas can allocate greater portions of their IT spending from basic infrastructure to public-facing applications that promote conservation, goodwill, and optimization of resources.”
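The savings logic behind collapsing 17 networks into one can be sketched as a simple total-cost-of-ownership comparison. All figures below are hypothetical placeholders for illustration only; the TMF's $734 million estimate is not broken out publicly:

```python
def consolidation_savings(per_network_annual_cost: float,
                          networks_before: int,
                          networks_after: int,
                          consolidated_overhead: float) -> float:
    """Annual cost avoided by collapsing duplicate networks (toy TCO model)."""
    cost_before = per_network_annual_cost * networks_before
    cost_after = per_network_annual_cost * networks_after + consolidated_overhead
    return cost_before - cost_after

# Hypothetical: 17 networks at $8M/year each, versus one network
# plus $30M/year of shared managed-service overhead.
print(consolidation_savings(8_000_000, 17, 1, 30_000_000))  # 98000000.0
```

The point of the model is the structure, not the numbers: duplicated fixed costs scale with the network count, while a consolidated network pays one set of fixed costs plus a shared overhead.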

Washington said at a June 14 event sponsored by ACT-IAC that this effort will be done through the Enterprise Infrastructure Solutions (EIS) vehicle run by GSA.

Gary Washington is the USDA chief information officer.

“We have submitted our project plan. We’ve been working with our contractor, and we’re aggressively working toward meeting the milestones that we have set,” Washington said during the panel discussion on EIS. “We had approximately [a] 54% disconnect rate currently, so we’re going [to] try to aggressively get that up. But the primary thing is we’re going to transition to a managed service model, where we can modernize the network periodically like we’re supposed to. It [will] provide better service because we’re all over the country, and we have a national presence as well.”

USDA awarded Lumen a task order under EIS in January that could be for as much as 11 years and could be worth $1.2 billion.

One of the key factors for the TMF Board to make an award is whether the agency has an existing project and contract in place.

Long-running challenge at USDA

Previous attempts to consolidate and modernize USDA’s networks have struggled, but not for lack of trying or will. For example, in 2011, then-Secretary Tom Vilsack — who is back as secretary today — approved a report detailing 379 recommendations for improving agency operations and saving administrative money to reinvest into citizen services.

Washington even called in the IT Centers of Excellence in 2018 to help wrangle these networks and move applications to the cloud. While the IT CoEs helped the agency move more applications and systems to the cloud, Washington said EIS gives USDA an opportunity to rethink how its underlying infrastructure supports its cloud environment and cloud strategy.

“It’s going to really happen, and I’m going [to] make sure I will be here to make sure it does happen,” Washington said about the network consolidation. “We were very excited about that. Our leadership is excited about it. Actually, I briefed the secretary on this last week. So it’s just really a matter of getting our equipment and making sure that we would pick up the pace in terms of actually implementing this solution. We’ve got a lot riding on this.”

As for the investments at DHS and FTC, the board continues to signal its desire to choose projects that could have broad impacts on users or on demonstrating how something could work.

DHS released an RFI to modernize HSIN in April, seeking industry feedback on how to take the current platform from complex, costly and not optimized for cloud-based and mobile features to one that better supports end-users and rapidly addresses threats to homeland security.

“DHS looks to redefine information accessibility and build a modern, comprehensive information sharing platform using cloud-based technologies to increase speed, mobility, and access to unclassified information,” the RFI stated.

Cloud native, modern tools

With the TMF money, DHS says it will “rebuild its information sharing system as a cloud native platform with modern tools and technologies. The new platform will be capable of scaling up to meet peaks in demand during times of emergency while also offering significant new features including improved access and security; better content sharing and discoverability; and greater emphasis on connecting HSIN’s partners to each other for closer collaboration. DHS will also use the TMF investment to build a platform that is more responsive to a post pandemic work environment for users with easy and secure access on mobile platforms and other devices.”

DHS expects the extra money to accelerate its modernization effort to create a system that is more flexible, offers a better user experience and costs less.

The board’s award to the FTC — its fourth specifically around zero trust — is almost a proof-of-concept that other small agencies could take advantage of.

The FTC will use the extra funding to replace its existing security operations center that is built for government-operated data centers and has trouble scaling to address growing cyber threats.

“[T]he FTC will expedite its SOCaaS implementation using security services and trusted cloud service providers to host sensitive FTC data. This comprehensive approach will greatly reduce the risk of bad actors executing a ransomware or other cyber attack,” the TMF website stated. “It will also reduce the number of man hours currently expended to respond to indicators of cyber incidents. These hours could then be repurposed to continue improvements to the agency’s many operational systems for merger filing review and reporting of fraud. The agency is collaborating on this effort with other federal cyber security leaders, including the Department of Homeland Security, to share best practices.”

Raghav Vajjhala, the FTC CIO, said in July 2021 that half of the agency’s systems and applications are in the cloud and that it is upgrading its network to be software-defined. These modernization efforts need a modern security architecture, he said.

CISA provides agencies with long-awaited cloud security guidance https://federalnewsnetwork.com/cybersecurity/2022/06/cisa-provides-agencies-with-long-awaited-cloud-security-guidance/ https://federalnewsnetwork.com/cybersecurity/2022/06/cisa-provides-agencies-with-long-awaited-cloud-security-guidance/#respond Fri, 17 Jun 2022 11:28:28 +0000 https://federalnewsnetwork.com/?p=4106365 The Cybersecurity and Infrastructure Security Agency has released new guidance for applying modern network security practices across multiple cloud computing scenarios. It’s another evolution in a years-long effort to make it easier for agencies to securely adopt cloud services.

CISA published the draft Trusted Internet Connections (TIC) 3.0 Cloud Use Case on Thursday. CISA is accepting comments on the draft document through July 22 before it works toward publishing a final version.

In a blog post, CISA Executive Assistant Director for Cybersecurity Eric Goldstein wrote that the use case builds upon last May’s cybersecurity executive order and CISA’s Cloud Security Technical Reference Architecture. An initial version of the “TRA” was published last fall.

“With the appetite for cloud guidance growing, this new CISA resource will help federal agencies effectively leverage applicable aspects of the Cloud Security TRA and work to achieve a mandate in the EO for secure cloud services,” Goldstein wrote.

The cloud use case has been highly anticipated since the White House Office of Management and Budget rescinded previous TIC policy and directed CISA to update the TIC initiative nearly three years ago.

The September 2019 memorandum from then-Deputy Director for Management Margaret Weichert identified previous requirements for agencies to flow traffic through a physical TIC access point as “an obstacle to the adoption of cloud-based infrastructure.”

Her memo directed CISA to publish TIC use cases to identify alternative security controls for scenarios when traffic is not required to flow through a TIC access point. The cloud use case is the final product to drop in the TIC 3.0 series. CISA has already published the Traditional TIC Use Case, Branch Office Use Case, and Remote User Use Case.

Ross Nodurft, former head of OMB’s cyber team and executive director of tech industry group Alliance for Digital Innovation, welcomed the new CISA guidance, calling it a “policy release valve” for agencies looking at cloud security architectures beyond the old TIC access points.

“It’s been a very long time coming,” Nodurft said. “And frankly, I know a lot of the agencies have been asking for it, because it provides a bunch of different iterations of what architectures could and should look like.”

The document covers security considerations across Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), Software-as-a-Service (SaaS), and Email-as-a-Service (EaaS) deployments.

“This guidance also incorporates cloud-specific considerations, such as the shared services model and cloud security posture management principles outlined in the Cloud Security TRA,” Goldstein wrote in his blog. “Another unique aspect of this use case is that it was written from the vantage point of cloud-hosted services, as opposed to from the vantage point of the client accessing these services.”

It further breaks the guidance down into different “security patterns,” such as when an agency campus network connects with a cloud service provider versus when a remote user connects to cloud resources.

The idea is to give agencies more clarity on how they can securely adopt cloud services, especially after last May’s executive order directed agencies to “accelerate the move to secure cloud services.” OMB has also directed agencies to start adopting zero trust architectures by, in part, leveraging the security features in cloud services.

“While this use case can be leveraged as agencies move towards Zero Trust Architectures, implementation of zero trust requires additional controls, additional rigor of applying security capabilities, and measures beyond those detailed in this use case,” the use case document states.

Nodurft said another important advancement in the cloud use case is the discussion around telemetry, or network data collected to detect cyber threats. TIC access points have traditionally collected telemetry data.

The guidance states that agencies should track access to “all agency data and applications in the cloud and analyze all access events for suspicious behaviors,” while noting that many cloud service providers have capabilities in place for logging, monitoring and analysis of telemetry data.

“We want to start talking to the agencies about what type of telemetry data we can capture, given the security tools and security capabilities you guys are employing in your security architectures,” Nodurft said. “And then what does that look like from a centralized log aggregation repository? Are we going to really, finally, be able to have a centralized view from CISA, and are agencies going to be able to then use that same telemetry information to look at their own security networks in a new way?”
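The guidance's call to "analyze all access events for suspicious behaviors" can be illustrated with a toy pass over cloud access telemetry. This is a sketch only: the event fields, threshold, and heuristic are invented for illustration and are not part of the CISA use case:

```python
from collections import Counter

def flag_suspicious(events: list[dict], failure_threshold: int = 5) -> set[str]:
    """Flag users whose count of denied access events in a telemetry batch
    meets a threshold (a toy heuristic, not the CISA guidance itself)."""
    failures = Counter(e["user"] for e in events if not e["allowed"])
    return {user for user, n in failures.items() if n >= failure_threshold}

# Hypothetical batch: repeated denials for one user, a few for another.
events = (
    [{"user": "alice", "allowed": True}] * 3
    + [{"user": "mallory", "allowed": False}] * 6
    + [{"user": "bob", "allowed": False}] * 2
)
print(flag_suspicious(events))  # {'mallory'}
```

In practice, cloud providers' native logging services would supply the events, and the analysis would feed the kind of centralized log aggregation repository Nodurft describes.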

Commerce BIS, Coast Guard closing in on infrastructure modernization wins https://federalnewsnetwork.com/reporters-notebook-jason-miller/2022/06/commerce-bis-coast-guard-closing-in-on-infrastructure-modernization-wins/ https://federalnewsnetwork.com/reporters-notebook-jason-miller/2022/06/commerce-bis-coast-guard-closing-in-on-infrastructure-modernization-wins/#respond Wed, 15 Jun 2022 15:55:13 +0000 https://federalnewsnetwork.com/?p=4103717

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

The return of in-person conferences still is a bit weird. As most attendees will say, it’s great to see people in person, but it’s less fun to wear “real” clothes and shoes. The “business on the top, vacation on the bottom” look (dress shirt and shorts) doesn’t work for most people when you are in a hotel or conference center.

Maybe the best part of the return to in-person events, at least for intrepid reporters, is the ability to ask follow-up questions after a presentation or speech. That is when you turn a story that is likely to be a lemon into sweet lemonade.

At the recent Emerging Technology and Innovation Conference sponsored by ACT-IAC in Cambridge, Maryland, the lemonade was flowing thanks to the bevy of speakers who were willing to talk about all the good things happening in their agency.

From Army chief information officer Raj Iyer offering an update on his digital transformation efforts, to Sonny Hashmi, the commissioner of the Federal Acquisition Service in the General Services Administration, talking about the latest contract to buy cloud services, to Stacie Alboum talking about her new job at the Federal Deposit Insurance Corporation as deputy director of enterprise strategy, the news flowed like, well, lemonade.

But here are three items you may have missed from the event.

AFWERX moving back to DC

The Air Force’s innovation arm missed the Washington, D.C. metro area after all.

AFWERX closed its offices in Arlington, Virginia during the pandemic, figuring it would use its offices in Las Vegas and Austin, Texas as places to recruit innovative companies.

But like in Godfather Part III, AFWERX may have been screaming  “just when I thought I was out, they pull me back in” to Washington, D.C.

Garrett Custons, a Spark cell director at AFWERX, said the organization is looking for new space in the D.C. metro area.

“It’s really a blank slate with what it could look like,” Custons said. “We want to build out an incubator in the D.C. area. We’d love it to be co-located with other organizations in the government innovator space. We don’t [want] just the space, but a place where tools and products can be tested.”

AFWERX, which the Air Force launched in July 2017, focuses on accelerating agile and affordable capabilities by teaming innovative technology developers in the private sector with Airman and Guardian talent.  In 2020, the Air Force split AFWERX into three different branches: AFVentures, Spark and Prime. The Spark branch is focused on empowering innovation at the operational edge.

Custons said the decision to rethink the need for an office in the D.C. area is based on two factors. The first is the internal growth of staff. The second is the number of vendors in the D.C. metro area.

“This is where the decision makers are,” he said. “It’s a logical progression of the lifecycle of AFWERX to help companies get into the federal market.”

AFWERX has money set aside for the office space, but isn’t against the idea of sharing space with other agencies or innovation cells.

Custons said one option would be to share space with the Office of the Undersecretary of Defense for Research and Engineering and the National Security Innovation Network in Arlington, Virginia.

“If a government organization has office space, we’d like to talk to them. It’s hard to know what is available and what’s out there,” he said. “We are talking to the General Services Administration because they have collaboration space that isn’t being used as much as they thought, so maybe [there’s a] partnership play there.”

Commerce BIS sprint to the cloud

You’d think moving to the cloud would by now be passé. Agencies have been talking about it for more than a decade.

For the Commerce Department’s Bureau of Industry and Security, cloud services represent an entirely new way of doing business.

Mike Palmer, associate chief information officer for BIS, said the goal of moving to the cloud is, of course, IT modernization. But the bigger win for BIS will be how cloud services free up data and break down silos.

“We’ve focused over the last six months on upgrading our infrastructure. In January, we decided to take our entire infrastructure to the cloud and out of this archaic on-premise based infrastructure,” he said. “By July 1, our six month move of our entire infrastructure to the cloud should be complete. In the meantime, in parallel, we are starting to do some interesting things with data. It gives us more flexibility to make quicker decisions.”

Palmer said BIS is launching a pilot program around a data warehouse and data sharing platform to improve how they work with the intelligence and law enforcement communities as well as conducting a pilot to take some of its data from licensing offers and turn it into export control impact.

“One of the things we believe in is trying things on a smaller scale and expanding from there, so [a] quick, small investment to prove out a concept,” he said. “The next phase of our product lifecycle modernization effort is to do a lot of user research over the summer as part of our enterprise modernization activities.”

A BIS spokesperson offered a few more details by email.

The spokesperson said the move to the cloud will set the foundation for a broader modernization journey that includes creating new data sharing capabilities, public-facing digital services and a zero trust cybersecurity architecture.  The move to the cloud is expected to improve BIS’s operational resiliency and security, reduce costs, and provide modern tools for developing new software applications that will improve the BIS customer experience.

Palmer said at the event that one of the biggest challenges for BIS is getting the workforce comfortable with using cloud services and no longer being in a physical environment.

BIS expects the infrastructure modernization to save money, but Palmer said the CIO’s office still is finalizing those details.

Coast Guard less disconnected

The Coast Guard Commandant’s tech revolution will not be televised, but it now will be on Zoom or Microsoft Teams.

That’s right, major cutters now have enough bandwidth to use video teleconference platforms.

Brian Campo, the Coast Guard’s deputy CIO, said the service recently upgraded the communication bandwidth for all major cutters, which are out to sea 180 to 200 days a year.

“The Coast Guard has been going out with Navy fleets for the last several years into places like Indo-PACOM and around the horn of Africa, but also going up into the Arctic. These are places [where] communications are really challenging. So one of the things we have been trying to do is upgrade equipment, working with industry partners and looking at different communications links we could use,” he said. “One of the most amazing things [we] have done in about the last year is we’ve doubled connectivity to the major cutters. What we have been able to do is upgrade them so that they have enough bandwidth so now on the morale side in some of the mess decks and personnel areas, they can actually get what we would call ‘dirty’ internet to be able to send email back to loved ones. Just recently we doubled their internet again so they can actually do video teleconferences using Teams and Zoom to reach back and talk with their loved ones.”

Former Coast Guard Commandant Adm. Karl Schultz, who retired on June 1, made the increase of bandwidth to cutters a central part of his Tech Revolution plan.

The Tech Revolution Plan includes four other priorities: data to decisions; software, mobility and the cloud; cyber readiness; and command, control, communications, computers, cyber and intelligence (C5I).

Campo said the Coast Guard now is adding two new lines of effort: command and control, and navigation.

“Each of those two new systems are game changing to the Coast Guard. They are systems we have been leveraging from the Defense Department that we will be retiring in the next few years,” he said. “We are trying to build out some new replacements for those systems and taking a different approach. We are leveraging what we did in the first half of the tech revolution bringing in things like data, making data part of what we do for our C2 systems, making sure as we develop navigational systems we are leveraging the technology through commercial satellite communications. We are thinking about how we can use artificial intelligence to actually build out navigation systems that can manage these over congested ports and work with the shippers to give them more information as they come into a port.”

Marines aim to solve the DDIL challenge https://federalnewsnetwork.com/reporters-notebook-jason-miller/2022/06/marines-aim-to-solve-the-ddil-challenge/ Tue, 14 Jun 2022 16:38:07 +0000

The Defense Department has always prepared to fight in an environment that is austere, stretches supply lines and is, to put it mildly, unfriendly.

But that preparation focused mainly around kinetic warfare where Marines or soldiers would have to face an enemy that was, relatively speaking, close and understood.

Todd Harrison, a senior associate in the Aerospace Security Project and Defense Budget Analysis program at the Center for Strategic and International Studies (CSIS), wrote in a 2021 report that “For some types of non-kinetic attack, third parties may not be able to see that an attack has occurred, or the party being attacked may not know right away who is attacking. For these reasons, non-kinetic attacks may be perceived as less escalatory in some situations, although this remains a point of debate. It can be difficult to determine if some non-kinetic forms of attack are effective, particularly if the effects are not publicly visible. And some methods of attack — such as exploiting zero-day vulnerabilities in a cyberattack — may have a limited period of effectiveness before an adversary develops defenses against them.”

Non-kinetic attacks are not limited to weapons systems. They also target logistics, hampering the movement of supplies and troops; communications, making data sharing more difficult; and GPS, through jamming and spoofing.

Today, the Marines are preparing for an environment that is disconnected, denied, intermittent and/or with limited bandwidth (DDIL) where the enemy could be hundreds of miles away, behind screens and impacting both kinetic and non-kinetic capabilities.

The Marine Corps awarded General Dynamics IT (GDIT) a task order under the Defense Enterprise Office Solutions (DEOS) contract to test how it can receive Microsoft Office capabilities both on premises and in the cloud in a classified environment approved at the secret level.

The Defense Information Systems Agency and the General Services Administration awarded GDIT the 10-year DEOS contract that has a $7.6 billion ceiling in August 2019. DISA began migrating users to DEOS in January 2021 after protests and corrective action delayed the implementation.

Navy leading DDIL working group

Jim Matney, vice president and general manager of the DISA and Enterprise Services Sector for GDIT’s defense division, said in an email to Federal News Network that GDIT already is supporting an unclassified environment for these services that is rated at impact level 5 (IL5). He said through this proof of concept that mainly will be done in a lab environment, the Marines will be able to see how the enterprise collaboration tools can work in DDIL environments.

The six-month project is worth under $1 million.

The Marine Corps Tactical Systems Support Activity (MCTSSA) has put together a DoD DDIL lab environment where GDIT will evaluate these proposed architectures and developed capabilities.

GDIT says it also will partner with Microsoft to test capabilities, investigate scenarios and provide applicable recommendations for mission partners deployed in a DDIL environment.

“[T]hese collaboration services must also operate on-premises. As cloud service providers are providing more software-as-a-service (SaaS) offerings to support collaboration, such as Office 365, users must have access to the cloud to leverage these capabilities,” Matney said. “The challenge then becomes ensuring the on-premises solution used to support DDIL in an outside the continental U.S. (OCONUS) environment can interface with the enterprise capability that is being used in CONUS.”

Matney said the on-premises collaborative capabilities, such as Microsoft Exchange, Skype for Business and SharePoint, must remain and integrate with the cloud-based services.

GDIT says the proof of concept will include testing several different scenarios for accessing capabilities, including word processing and spreadsheets, email and calendar, and file sharing and instant messaging.

All of this is helping the DoD figure out how to deploy DEOS in DDIL environments, where reliable and timely connectivity to warfighters at the tactical edge is critical.
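A hypothetical sketch of how such a proof of concept might enumerate its test scenarios, crossing each collaboration capability named above with each DDIL network condition. The names are illustrative, not GDIT's or MCTSSA's actual test plan:

```python
# Build a test matrix: every collaboration capability under every DDIL
# network condition. itertools.product yields the full cross product.
from itertools import product

CAPABILITIES = ["word processing", "spreadsheets", "email",
                "calendar", "file sharing", "instant messaging"]
CONDITIONS = ["disconnected", "denied", "intermittent", "limited bandwidth"]

def test_matrix():
    """Yield each (capability, condition) pair to evaluate in the lab."""
    yield from product(CAPABILITIES, CONDITIONS)

scenarios = list(test_matrix())
print(len(scenarios))  # 6 capabilities x 4 conditions = 24 scenarios
```
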

Refine requirements, develop use cases

This task order proof of concept with the Marines is part of the DoD chief information officer’s effort to find technology capabilities that provide seamless operations in denied, degraded, intermittent and limited bandwidth environments.

In 2021, the DoD CIO designated the Department of Navy CIO as the executive agent to lead a cross-service joint working group focused on DDIL.

“These low bandwidth and high latency conditions are prevalent at the tactical edge and experience regular disconnects from the broader network, including cloud services, often for substantial periods of time,” the DON CIO’s office wrote in late 2021. “Network server software and hardware exist at the tactical edge to provide critical IT services and data in these DDIL environments, along with a variety of spectrum communications and unclassified and classified network transports leveraging satellite links and low-Earth Orbit (LEO), Wi-Fi, cellular/4G LTE, millimeter wave/5G and others.”

The working group is leaning on industry for help in refining DoD requirements and use cases to develop standardized architectures and capabilities in these austere environments.

“These tools operate as a hybrid capability, which will allow users access to the full feature set when cloud connectivity is available, but remain productive locally within the DDIL environment,” the DON CIO wrote.
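The hybrid pattern the DON CIO describes can be sketched as a client that always accepts work locally and drains an outbox when connectivity returns. This is a minimal illustration of the general store-and-forward idea, not any actual DEOS or DoD implementation; all names are invented:

```python
# Minimal offline-first client: writes always succeed locally; anything
# saved while disconnected queues in an outbox and syncs on reconnect.
from dataclasses import dataclass, field

@dataclass
class HybridClient:
    connected: bool = False
    outbox: list = field(default_factory=list)   # changes made while offline
    synced: list = field(default_factory=list)   # changes confirmed in cloud

    def save(self, change: str) -> None:
        """Local save always succeeds; cloud sync happens only if connected."""
        if self.connected:
            self.synced.append(change)
        else:
            self.outbox.append(change)

    def reconnect(self) -> None:
        """Link restored: flush everything queued during the disconnect."""
        self.connected = True
        self.synced.extend(self.outbox)
        self.outbox.clear()

client = HybridClient()                # starts in a DDIL (disconnected) state
client.save("draft op order")          # user stays productive locally
client.save("position report")
client.reconnect()                     # connectivity returns; outbox drains
print(client.synced)
```
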

Matney said GDIT is currently supporting multiple agencies across the DoD, civilian, and intelligence sectors with on-premises collaborative capabilities that may be considered and tested as potential DDIL approaches.

The challenge that the Marines are trying to solve isn’t just a Marine Corps or DoD challenge. It’s one nearly every agency, from the departments of Treasury to Homeland Security to Justice, faces. And with so much dependency on email communication and collaboration tools, having access no matter the network environment is critical.

Army’s 2023 IT, cyber budget request aims to push digital transformation further, faster https://federalnewsnetwork.com/army/2022/06/armys-2023-it-cyber-budget-request-aims-to-push-digital-transformation-further-faster/ Mon, 13 Jun 2022 14:38:06 +0000


The Army is expecting fiscal 2023 to be a big year for its digital transformation efforts.

The question, as with most agency programs, is whether Congress will deliver on the Army’s budget request.

Dr. Raj Iyer is the Army’s CIO.

“Fiscal 2023, for us, is that year of inflection when it comes to our digital transformation journey,” said Raj Iyer, the Army’s chief information officer, during a press briefing on June 9. “We need to make sure that the investments that we have are appropriately aligned to the Army’s priorities and to the Defense Department priorities, quite honestly through the release of the National Defense Strategy.”

The Army will have to support those priorities, which Iyer and Lt. Gen. John Morrison, the Army’s G6, laid out in the October digital transformation strategy, through a mostly flat budget request buttressed by savings from IT modernization efforts.

Iyer said the Army’s IT and cybersecurity budget request is $16.6 billion in 2023, which is the largest of all DoD services. The request makes up slightly less than 10% of the Army’s total budget request of $180 billion.

This was the first time the Army detailed its 2023 IT and cyber budget request since President Joe Biden sent his spending wish list to Congress in March.

Inside that $16.6 billion request is $2 billion for cybersecurity, including offensive and defensive operations, network operations and research and development.

Iyer said the bulk of the Army’s IT and cyber investment will go to network support and modernization.

“This is about $9.8 billion. This is clearly supporting all the way from the tactical edge, including the support of current operations, all the way to investments we’re making in the cloud,” Iyer said. “Gen. Morrison has spoken about the unified network at a number of events…but 2023 really is also our opportunity to scale our cloud efforts that we have made some tremendous progress in 2021 and 2022. We’re seeing about a $290 million investment in cloud in 2023 to continue to further our cloud migration journey. There’s some tremendous activity across the Army right now in terms of operationalizing the cloud that we’ve established in what we call c-Army.”

Spending less on legacy technology

The Army also is asking for about $220 million for artificial intelligence and data related initiatives.

Iyer said he expects his operations and maintenance (O&M) budget to support current and legacy IT to be slightly lower in 2023 compared to this year as well.

Over the last few years, the Army has been on a path to reduce reliance on old technology and consolidate tools.

Iyer said by moving to Office 365 and through other consolidation efforts, his office has found money to reinvest in modernization.

“One is the convergence of our networks as part of our unified network strategy. We’re looking at converging 42 networks across the Army into that single, unified network. What that will do really is start to consolidate all of the tools into a common service catalog, get us to a common set of processes and to standardization across the network. That will inherently result in cost savings into 2023,” Iyer said. “Beyond that, there’s some other things that we have done in terms of reducing our bills for 2023 based on our current spending. One of them really is the recent decision to complete or finish out our enterprise IT-as-a-service pilots. We had three contracts in place at three pilot locations that we had selected. Most recently, based on the results and the lessons learned, we have come to a conclusion that we have good data in order to be able to deploy some common services that are cloud enabled across all Army locations worldwide.”

He said O365 is one example of those services. He said his office will deploy a standard virtual desktop infrastructure as well in the coming years.

Morrison added that some of the decisions to find savings mean getting rid of technology altogether, like video teleconferencing hardware and software, since the Army can use those capabilities through O365.

“As enterprise capabilities come online, we just need to be ruthless in our governance of it to make sure that we did keep the best of legacy capabilities and we don’t hang on to something just to hang on to it,” Morrison said. “I think we are putting the mechanisms in place to really start getting after that. We’re continuing to shut data centers. We are continuing to leverage the great capabilities that come with c-Army. We’re not doing it at the speed and tempo that we probably can. And quite frankly, Dr. Iyer and I had a discussion just earlier [on June 9] about reinvigorating those efforts, because even though we’re past the goals that had been set for the Army from an efficiency perspective, I would submit to you, more importantly from an operational effectiveness perspective, we need to move a little bit faster and harmonize with these hybrid cloud operational environments. That will only drive us faster toward data centricity. It will only drive us toward a unified network that can support multi-domain operations.”

5 ERPs, 150 support systems

One of Iyer’s biggest and boldest priorities for 2023 that, over the long term, should result in significant cost savings is modernizing the Army’s business systems.

The Army plans to spend $1.4 billion on maintaining five enterprise resource planning (ERP) systems for financial management, human resources and the like as well as 150 support systems.

Iyer said many of these ERP and related systems are more than 20 years old and ready to be updated and moved to the cloud.

The Army has been focused on reducing and modernizing its business systems for the last decade. In 2017, the service reported it cut the number of business systems to 400 from 800. In 2020, it upgraded the General Fund Enterprise Business System to be more of a shared service for other Defense agencies and had plans to take the system to the cloud.

“Our marquee effort in 2023 is going to be our implementation or initial prototyping for our new enterprise business systems convergence,” he said. “We’re trying to converge them into a single architecture or into a single system if we can. If we have one integrated capability, then, more importantly, the data that we can pass across that spectrum of operations for analytics. It is a massive, multi-year modernization effort. We fully expect that it will be as high as 10 years for us to get to that modernization effort. But the approach that we’re taking isn’t a big bang approach that we’ve typically used in the past. This is going to be more of an evolutionary modernization approach.”

Along with the budget request, Iyer said the Army will take another key step this summer when the Program Executive Office-Enterprise Information Systems (PEO-EIS) will release a call for white papers under an other transaction agreement (OTA) approach to better understand what industry has to offer.

“Since we are going to use an OTA process for this acquisition, there is going to be a lot of interaction with the industry to figure out what’s out there that the Army can adopt rapidly, as well as ensure we have a future proof architecture,” Iyer said. “We expect to award multiple prototypes in early 2023. These would run anywhere from 12-to-18 months. Then at the end of that effort, just like any OTA, we will get to a production contract by down selecting one of those prototypes to be our production solution.”

Iyer said while the OTA will not be prescriptive, the Army wants to see how industry responds with a solution that uses a modular architecture, supports data exchange through application programming interfaces (APIs) and microservices, and is cloud native.
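The API-based data exchange Iyer describes can be sketched as two business modules that share data only through a narrow, versioned JSON contract rather than shared database tables. Everything below, including the field names and function names, is invented for illustration and is not any actual Army system:

```python
# Sketch of API-first data exchange between two hypothetical business modules:
# the consumer depends only on the JSON contract, never on the producer's
# internal storage, so either side can be modernized independently.
import json

def personnel_api_get(record_id: int) -> str:
    """A 'human resources' module's read endpoint, returning versioned JSON."""
    record = {"api_version": "v1", "id": record_id,
              "name": "J. Doe", "unit": "PEO-EIS"}
    return json.dumps(record)

def payroll_consume(payload: str) -> str:
    """A 'financial management' module that parses only the agreed contract."""
    data = json.loads(payload)
    return f"pay record created for {data['name']} ({data['unit']})"

# The two modules interoperate purely through the serialized interface:
print(payroll_consume(personnel_api_get(42)))
```

In a real microservice deployment the JSON would travel over HTTP rather than an in-process call, but the coupling story is the same: only the contract is shared.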

“We’ll be looking at how flexible the solution will be in terms of its ability to implement Army unique processes wherever we have them, without the need to customize commercial off the shelf products,” he said. “This is going to be evolutionary modernization as we are doing this in an agile approach. We will let the functional priorities define what those increments will be, and then we will look at the risk profile to look at how quickly we can get those turned on. That will determine the level of funding and the timeline for implementation. From an implementation perspective, one of the things that we’re going to be pushing industry for is to truly do this using dev/sec/ops and in an agile manner, which means that we are looking for functionality to be available or released to users on rapid sprints, not taking years to do this. This is all about getting functionality in the hands of the user rapidly through agile development.”

How the Defense Department will modernize its software, and what that means to contractors https://federalnewsnetwork.com/defense-main/2022/06/how-the-defense-department-will-modernize-its-software-and-what-that-means-to-contractors/ Tue, 07 Jun 2022 17:12:56 +0000

It’s an eternal question: how to grow your business in a mature market with lots of established players. The defense software market is as mature as any, and yet DoD has a pervasive need to modernize its software to take into account cloud computing, the need to refresh the military’s strategic offset, and a host of other reasons. For the big picture of what’s affecting the DoD software market, the Federal Drive with Tom Temin turned to Frost & Sullivan industry analyst Brad Curran.

Interview transcript:

Tom Temin: And even though it is a mature market, there is this modernization drive. So how would you characterize the market, maybe starting with a sense of the dollars involved in software acquisition?

Brad Curran: Just four new prime contracts in the last year, there’s been over $5 billion on the unclassified side. And so that includes both the traditional, more proprietary software to keep combat systems updated and operational, and also bringing in commercial technologies, cloud computing, as-a-service functions and things like that with more traditional commercial software companies. So it’s a large market and, mature as you stated, it’s still growing.

Tom Temin: And you list in this latest report a number of factors that are affecting the way that it’s buying software and the types of services it’s buying. There are some strategic influences on software; run through some of those for us.

Brad Curran: That’s right. It’s all about information sharing and collaboration. So on the tactical side, it’s that desire to tighten that sensor-to-shooter kill chain; on the enterprise side, it’s to collaborate better, so that resources are better utilized and different organizations can share information quickly and easily. And of course, with the intelligence community, where there are huge volumes of data, it’s very important both to be able to find what you’re looking for, do some analysis, now with the help of artificial intelligence tools, and, most importantly, share that information on a timely basis.

Tom Temin: And there seem to be two, not contradictory, but parallel tracks going on here. One is, as you mentioned, DoD seeking a new type of commercially available software, a lot of this in the cloud, software as a service. But at the same time, you mentioned the Army has established a Software Factory in Austin, Texas, and you’ve got AFWERX. So maybe there are three things going on: one, standard commercial offerings that they’re adopting; two, developing through these kinds of factories, DevSecOps; and three, looking for innovators that are not the traditional vendors. Maybe explain how that all ties in and what vendors have a shot at some of this business.

Brad Curran: That’s right. You mentioned Army Futures Command and the Defense Innovation Unit, all looking to kind of find that next Palantir, so to speak, and find those innovative companies that can really have an operational impact almost immediately. So it’s good for companies to participate in those activities. You also have very large companies, Microsoft and Dell and Amazon and Oracle and VMware and the rest, that are providing the same types of services that large commercial organizations need just to run day-to-day operations. And then on the tactical side, you have to have a survivable network, a network that can survive in all kinds of environments, both physically and if there’s hostile electronic warfare or cyber attacks going on. So all of those areas are proceeding simultaneously, and all are very important, and some cross over as well.

Tom Temin: And in looking at the contract funding by department, you have the three major armed services, of which Navy-Marine Corps is the largest, but the largest software market of all is joint service. That’s almost half of the total addressable market there. Is that mostly JADC2, or what else is going on in that large, almost half of the market that’s joint?

Brad Curran: JADC2 in the last couple of years has come on strong, with billion-dollar contracts being let; that is one of the big reasons. But also, the DoD CIO and the joint artificial intelligence organization all have a lot more influence than they used to, in an effort to kind of have standard software across all the services, again, both to save money and to improve operations and information sharing. So we’ve seen the joint commands and the joint organizations have a lot more influence in setting standards, and in sort of twisting the arm a little bit of the service commanders to get more standardized software and share information.

Tom Temin: We were speaking with Brad Curran. He’s an industry analyst with Frost & Sullivan. And recently we had a story here about a very small startup company, about 15 people, that landed a really large IDIQ with the Air Force pursuant to the Air Force’s JADC2, about a billion dollars over 10 years, for their part of JADC2 at the Air Force. What is the best way for emergent contractors, or contractors new to the defense market, that are willing to overcome what it takes to get a contract? What’s the best advice for them to grab hold of this changing and shifting market we’ve been describing?

Brad Curran: Yeah, as we mentioned, start out with the Defense Innovation Unit, with Army Futures Command, with the Air Force Research Lab, AFWERX, and a lot of those contracts at first can be very, very small. But once you show that you’re making a good contribution, and that the software can be integrated across several different networks or platforms, you have a much better chance of involvement. And then the other side is stick with the traditional companies: Lockheed Martin, Northrop Grumman, Raytheon, General Dynamics, the other large, traditional defense firms. Partner with them, because they’re looking to improve the software for the systems that they have, and in many cases they’re able to bring in some commercial solutions. And then the third leg is the big commercial companies, Microsoft, Amazon, Oracle and the rest. They have mature, established products, but they also need assistance from time to time, especially with engineering and integration, because they’re not really up to date on some of the nuances of the traditional defense market and the unique requirements of DoD, not just security but organizational and cultural issues as well. And, of course, across the board, firms that can provide artificial intelligence upgrades, cloud computing, and processing, exploitation and dissemination tools for the intelligence community, all of those sorts of things are in high demand.

Tom Temin: And if you go the route of, say, through AFWERX, or through the Defense Innovation Unit, it’s probably a little less friction to get your own direct contract, because it would likely be through other transaction authority.

Brad Curran: That’s absolutely correct. And not only that, they’ve come up with other contract vehicles as well. They’ve gotten the Silicon Valley and Austin and Boston technology communities involved and come up with additional, small, but non-traditional contracting vehicles. But the corner has been turned, you know, DoD is open to these types of things now.
They realize to stay competitive with our near peers, and against non-state actors, because of cybersecurity and, and other problems. You know, they need to again, innovate and adopt software at the speed of relevance. And to do that, you know, they have to engage more with the commercial software companies. And as you mentioned, DoD is also training their own uniform people as well. You know, the thought behind it is, once they're out in the field, and they come up against an unexpected problem, either because of a software glitch that needs to be patched, or because of adversary activity. They want to have uniform people in the field that can react quickly, and make adjustments to that software, and not have to wait and go back to the factory, so to speak.nn<strong>Tom Temin: <\/strong>Sure. So the grandfather's in uniform programmed things in Ada, then there was a whole generation of totally outsourced software. Now the grandsons in uniform are programming in Java.nn<strong>Brad Curran: <\/strong>Right, exactly. And in the last decade or so, you know, we've realized that with an army of contractors required to set up IT networks in the field that won't go away we'll still need that services and support. But it can't depend completely on contractors once we're out in the field.nn<strong>Tom Temin: <\/strong>And just a final question, how much legacy code is there to be converted, factored, modernized? Whatever. I mean, there was some very ancient languages that existed up until the 2000s. Is any of that still left?nn<strong>Brad Curran: <\/strong>There's still a lot of legacy code. 
So those types of services are invaluable, and most importantly, the engineering and integration services to be able to get those legacy systems to talk with more modern and up to date programs, especially for one off isolated weapons systems or surveillance systems that are very unique and vital, but still have to be upgraded and become more resilient both against adversary activity but also to enable more sharing of the data.<\/blockquote>"}};

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

It’s an eternal question: how to grow your business in a mature market with lots of established players. The defense software market is as mature as any, and yet the DoD has a pervasive need to modernize its software to account for cloud computing, the need to refresh the military’s strategic offset, and a host of other reasons. For the big picture of what’s affecting the DoD software market, the Federal Drive with Tom Temin turned to Frost and Sullivan industry analyst Brad Curran.

Interview transcript:

Tom Temin: And even though it is a mature market, there is this modernization drive. So how would you characterize the market maybe starting with a sense of the dollars involved in software acquisition?

Brad Curran: Just four new prime contracts in the last year, there’s been over $5 billion on the unclassified side. And so that includes both the traditional more proprietary software to keep combat systems updated and operational. And also bringing in commercial technologies, cloud computing as a service functions and things like that with more traditional commercial software companies. So it’s a large market and mature as you stated, it’s still growing.

Tom Temin: And you list in this latest report a number of factors that are affecting the way that it’s buying software and the types of services it’s buying. There’s some strategic influences on software, run through some of those for us.

Brad Curran: That’s right. It’s all about information sharing and collaboration. So on the tactical side, it’s, you know, that desire to tighten that sensor-to-shooter kill chain; on the enterprise side it’s to collaborate better, so that resources are better utilized. And, you know, different organizations can share information quickly and easily. And of course, with the intelligence community, where there are huge volumes of data, it’s very important both to be able to find what you’re looking for, do some analysis, now with the help of artificial intelligence tools, and most importantly, share that information on a timely basis.

Tom Temin: And there seem to be two, not contradictory, but parallel tracks going on here. And one is, as you mentioned, DoD seeking a new type of commercially available software, a lot of this in the cloud, software as a service. But at the same time, you mentioned the Army has established a Software Factory in Austin, Texas, and you’ve got AFWERX, and you’ve got others. Maybe there’s three things going on: one, standard commercial offerings that they’re adopting; two, developing through these kinds of factories, DevSecOps; and three, looking for innovators that are not the traditional vendors. Maybe explain how that all ties in and what vendors have a shot at some of this business.

Brad Curran: That’s right, you mentioned the Army Futures Command and the Defense Innovation Unit, you know, all looking to kind of find that next Palantir, so to speak, and find those innovative companies that can really have an operational impact almost immediately. So it’s good for companies to participate in those activities. You also have very large companies, Microsoft and Dell and Amazon and Oracle and VMware and the rest, you know, that are providing the same types of services that large commercial organizations need just to run day-to-day operations. And then on the tactical side, you have to have a survivable network, you have to have a network that can survive in all kinds of environments, both physically and if there’s hostile electronic warfare or cyber attacks going on. So all of those areas are proceeding simultaneously. And all are very important, and some cross over as well.

Tom Temin: And in looking at the contract funding by department, you have the three major armed services (the Navy/Marine Corps is the largest of the three), but the largest software market of all is joint service. That’s almost half of the total addressable market there. Is that mostly JADC2, or what else is going on in that large, almost half the market that’s joint?

Brad Curran: JADC2 in the last couple of years has come on strong, you know, with billion-dollar contracts being let; that is one of the big reasons. But also, the DoD CIO and the joint artificial intelligence organization all have a lot more influence than they used to, in an effort to kind of have standard software across all the services, again, both to save money and to improve operations and information sharing. So we’ve seen the joint commands and the joint organizations have a lot more influence in setting standards. And in sort of twisting the arm a little bit of the service commanders to get more standardized software and share information.

Tom Temin: We were speaking with Brad Curran. He’s an industry analyst with Frost and Sullivan. And recently we had a story here about a very small startup company, about 15 people, that landed a really large IDIQ with the Air Force pursuant to the Air Force’s JADC2 effort, about a billion dollars over 10 years, for their part of JADC2 at the Air Force. What is the best way in for emerging contractors, or contractors new to the defense market, that are willing to overcome what it takes to get a contract? What’s the best advice for them to grab hold of this changing and shifting market we’ve been describing?

Brad Curran: Yeah, as we mentioned, start out with the Defense Innovation Unit, with Army Futures Command, with the Air Force Research Lab, Air Force Works, you know, and a lot of those contracts at first can be very, very small. But once you show, you know, that you’re making a good contribution, and that the software can be integrated across several different networks or platforms, you have a much better chance of involvement. And then the other side is to stick with the traditional companies: Lockheed Martin, Northrop Grumman, Raytheon, General Dynamics, the other large, traditional defense firms. Partner with them, because, you know, they’re looking to improve the software for the systems that they have. And in many cases, they’re able to bring in some commercial solutions. And then the third leg is, you know, the big commercial companies, Microsoft, Amazon, Oracle and the rest. They have mature, established products. But they also need assistance from time to time, especially with engineering and integration, because they’re not really up to date on some of the nuances of the traditional defense market and the unique requirements of DoD, not just security, but for, you know, organizational and cultural issues as well. And, of course, across the board, firms that can provide artificial intelligence upgrades, cloud computing, and processing, exploitation and dissemination tools for the intelligence community, all of those sorts of things are in high demand.

Tom Temin: And if you go the route of, say, AFWERX, or the Defense Innovation Unit, it’s probably a little less friction to get your own direct contract, because it would likely be through the other transaction authority.

Brad Curran: That’s absolutely correct. And not only that, they’ve come up with other contract vehicles as well. They’ve gotten, you know, the Silicon Valley and Austin and Boston technology communities involved and come up with additional, small, but non-traditional contracting vehicles. But the corner has been turned, you know, DoD is open to these types of things now. They realize that to stay competitive with our near peers, and against non-state actors, because of cybersecurity and other problems, you know, they need to, again, innovate and adopt software at the speed of relevance. And to do that, you know, they have to engage more with the commercial software companies. And as you mentioned, DoD is also training their own uniformed people as well. You know, the thought behind it is, once they’re out in the field, and they come up against an unexpected problem, either because of a software glitch that needs to be patched, or because of adversary activity, they want to have uniformed people in the field that can react quickly and make adjustments to that software, and not have to wait and go back to the factory, so to speak.

Tom Temin: Sure. So the grandfathers in uniform programmed things in Ada, then there was a whole generation of totally outsourced software. Now the grandsons in uniform are programming in Java.

Brad Curran: Right, exactly. And in the last decade or so, you know, we’ve realized that the army of contractors required to set up IT networks in the field won’t go away; we’ll still need those services and support. But it can’t depend completely on contractors once we’re out in the field.

Tom Temin: And just a final question, how much legacy code is there to be converted, refactored, modernized, whatever? I mean, there were some very ancient languages that existed up until the 2000s. Is any of that still left?

Brad Curran: There’s still a lot of legacy code. So those types of services are invaluable, and most importantly, the engineering and integration services to be able to get those legacy systems to talk with more modern and up to date programs, especially for one off isolated weapons systems or surveillance systems that are very unique and vital, but still have to be upgraded and become more resilient both against adversary activity but also to enable more sharing of the data.
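The integration work Curran describes is, at its core, adapter engineering: wrapping a legacy interface so modern consumers can use it without the legacy system changing. A minimal, hypothetical sketch of the idea (the class and field names are illustrative, not drawn from any real DoD system):

```python
import json

class LegacyTrackFeed:
    """Stand-in for an old system that emits fixed-width text records."""
    def read_record(self) -> str:
        # columns: id, latitude, longitude (whitespace-separated here)
        return "000042  38.8895  -77.0353"

class TrackFeedAdapter:
    """Wraps the legacy feed and exposes its records as JSON,
    the format a modern consumer expects."""
    def __init__(self, legacy: LegacyTrackFeed):
        self._legacy = legacy

    def read_json(self) -> str:
        raw = self._legacy.read_record()
        track_id, lat, lon = raw.split()
        return json.dumps({
            "id": int(track_id),
            "lat": float(lat),
            "lon": float(lon),
        })

adapter = TrackFeedAdapter(LegacyTrackFeed())
print(adapter.read_json())  # {"id": 42, "lat": 38.8895, "lon": -77.0353}
```

The legacy system keeps running untouched; only the adapter layer has to be engineered, tested and upgraded, which is the "get legacy systems to talk with more modern programs" service Curran points to.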

DISA moves 95 applications out of the sunsetting milCloud 2.0 platform
https://federalnewsnetwork.com/defense-news/2022/06/disa-moves-95-applications-out-of-the-sunsetting-milcloud-2-0-platform/
Thu, 02 Jun 2022 15:05:49 +0000

The Defense Information Systems Agency shut down the milCloud 2.0 platform a week ahead of schedule.

Sharon Woods, the director of the Hosting and Compute Center at DISA, said her team successfully transitioned 120 accounts off of milCloud 2.0 ahead of the June 8 deadline. Of the 120 accounts, 95 were on milCloud 2.0 and the rest were in a commercial cloud already and didn’t need to be transferred.

Sharon Woods is the director of the Hosting and Compute Center at DISA.

“Of the 95, 60 of those went to Stratus, DISA’s private cloud offering, 18 of them went to commercial cloud, and then there were 17 whose accounts they just let expire. They were typically research and development or sandbox environments that had already served their purpose. In total, it’s over 1,700 terabytes of data and 820 virtual machines,” Woods said in an exclusive interview with Federal News Network. “milCloud 2.0 is sunsetting and milCloud 1.0 is sunset as well. That, as a capability, no longer exists. For Stratus, we did consume some of that very basic underlying infrastructure of milCloud 1.0, but then immediately layered on a lot of new capabilities so that it became a new capability unto itself.”
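Woods’ figures form a self-consistent ledger. A quick sketch, using only the numbers quoted above, confirms the breakdown:

```python
# Migration tallies from Woods' statements (figures as quoted, not authoritative).
total_accounts = 120   # accounts transitioned ahead of the June 8 deadline
on_milcloud = 95       # accounts that were actually on milCloud 2.0
already_commercial = total_accounts - on_milcloud  # needed no transfer

# Where the 95 milCloud 2.0 accounts went
to_stratus = 60        # DISA's private cloud offering
to_commercial = 18
expired = 17           # R&D / sandbox environments allowed to lapse

# The three destinations account for every milCloud 2.0 workload
assert to_stratus + to_commercial + expired == on_milcloud
print(f"{already_commercial} accounts were already in a commercial cloud")
# 25 accounts were already in a commercial cloud
```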

DISA’s decision in December to end its milCloud relationship with GDIT came as a surprise. DISA awarded a contract to CSRA in June 2017 to develop and run the commercial cloud offering. GDIT bought CSRA in April 2018 for $9.7 billion. The milCloud 2.0 contract included a three-year base with five one-year options, and it was worth as much as $498 million. This June would have been the third option period for the program.

Woods said the self-imposed six-month deadline created an urgency that required people and resources from across DISA to meet.

“I think some of the things that helped make this work was when I say it was an all hands on deck within DISA, it wasn’t just the team that historically had worked on milCloud 2.0, we surged a lot of our personnel in order to help customers do this,” she said. “We know that this was difficult for customers to receive, this was not desired news. So DISA really made a commitment to serve as many people as possible to try and alleviate as much burden on the customers themselves. We’ve had relationships with all of these customers for years. We know their capabilities and we know their applications. With us having more resources at the table, I think that was really key in making the transition, and our customers were all so collaborative. That is also something that I think was a key driver in the success.”

There were some applications that Woods and her team weren’t sure could transition in six months. While she wouldn’t offer any specifics, she said some customers with an “enormous amount of data in their applications” took more effort, especially those that were on outdated or legacy versions of software.

“I will admit that we were a little nervous about the timing because of how much data was in certain applications,” Woods said.

For the 18 workloads that went to a commercial cloud, the move was a final step the services or defense agencies had been preparing for over the last few years to get the applications “cloud ready.”

“I see that as a positive move, if commercial cloud is the right environment for a mission partner, then that’s where they should be to meet their requirements,” she said. “At the Hosting and Compute Center, one of our key philosophies is being an honest broker. So sometimes that means consuming an offering that we’re providing, like Stratus as a private cloud environment, but other times it may mean going to a commercial cloud provider. We have the expertise to help guide customers in the different ways they could meet their requirements and which ones might be the best for them.”

DISA refunding money still a question

One big question that emerged after DISA’s decision to sunset milCloud 2.0 was what would happen to the funding already transferred by DoD customers to pay for the services.

Woods said for the 60 workloads that went to Stratus and the 17 that sunset altogether, there were no refund questions or challenges. But for the 18 that left DISA altogether, she said her office is working through the specifics around the period of performance and funding type for each particular task order.

“They may end up returning money, but I don’t have the exact numbers. We’re still just finalizing the exact numbers, but we’ve been working on that with them since the very beginning,” she said. “We have done everything we possibly can to make sure that customers have control over their funding. And I mentioned this before, this was a surprise and unexpected announcement to the customers. So DISA searched every resource that we had to be creative and figure out how to make this work for customers.”

Why GSA believes its new cloud services contract is different than past efforts
https://federalnewsnetwork.com/contractsawards/2022/05/why-gsa-believes-its-new-cloud-services-contract-is-different-than-past-efforts/
Mon, 30 May 2022 16:39:15 +0000

CAMBRIDGE, MD — The General Services Administration is trying, once again, to remove the complexities that agencies face when buying cloud services.

This has been a long-standing goal across multiple administrations and multiple attempts that have struggled to gain traction across government.

But the draft statement of work for the latest effort, called Ascend, which GSA released last week, is the culmination of months of research and discussion with agencies and industry experts.

Sonny Hashmi, the commissioner of the Federal Acquisition Service in the GSA, said he believes Ascend will be different than previous attempts to create big cloud procurement vehicles.

Sonny Hashmi is the commissioner of the Federal Acquisition Service at GSA.

“I don’t want to make the presumption that we’ve figured it out. The process to get to an endpoint on Ascend is going to require a lot of dialogue, and I don’t want us to move forward without it,” Hashmi said after his speech at the Emerging Technology and Innovation conference sponsored by ACT-IAC. “It goes back to how we were talking about user-centric design. There’s got to be a user need, and in this case, it’s got to be an agency need that Ascend will address. That will dictate what the vehicle looks like and how it’s going to be designed, because without it, it is not going to be successful.”

Hashmi, who served as GSA’s chief information officer and worked in industry before taking over as commissioner, is familiar with the previous attempts to create cloud vehicles. He said one reason for the lack of adoption is the vehicles didn’t take an agency mission-need first approach.

“At this point, we’re being very deliberate about making sure that there is an actual need on the other side of this. Adoption is going to happen not just because it’s going to be a forcing function, but because there’s actually a need that we’re solving. If we’re not, if it turns out that we’re behind and agencies don’t have a need, then I would rather actually not do this,” he said. “While we’re excited about this program, ultimately, its job is to solve a problem and help agencies to deliver on mission. If there’s a better way or a different way to solve for the problems that we are facing, we’re happy to change tactics on it.”

The current thinking for Ascend, as outlined in the draft statement of work, is to create three separate pools of vendors to deliver infrastructure- and platform-as-a-service, software-as-a-service and cloud professional services.

“The Ascend BPA is part of the GSA’s cloud marketplace vision of empowering agencies to develop and implement enterprise-level cloud acquisition strategies through a modernized and simplified approach to meet their IT and cybersecurity requirements,” the draft solicitation states. “The BPA will emphasize cloud smart/security smart objectives, and establish minimum baseline requirements for the acquisition, business, operations, reporting and technology capabilities provided by commercial cloud service providers (CSPs) and cloud-focused labor service providers that are not currently accessible under other GSA Multiple Award Schedule (MAS) or governmentwide acquisition contracts (GWACs). The Ascend BPA will focus on enabling support for both vertical (e.g., IaaS, PaaS, SaaS) and horizontal capabilities across the ecosystem and will provide more effective system integration and managed support services for the delivery of flexible, diverse, and secure cloud solutions.”

Feedback on Ascend is due to GSA by June 6.

New level of maturity to cloud buying

“We’ve been thinking about this challenge for some time, at least a year of internal deliberations. But now it’s time to really get industry engaged, and that’s why we released the draft work statement. We look forward to robust conversations, both from cloud service providers, services companies, system integrators and others, to really help us think about not only the purchasing mechanisms and methods, but also to help shape our thinking around what the future of digital transformation will look like,” he said. “We’re hoping this will be one mechanism, or the primary and most usable mechanism, for agencies to think about when they’re thinking about modernizing their digital stacks.”

More broadly, Hashmi said, Ascend is trying to bring the “next level of maturity” to agencies as they adopt cloud services.

He said Ascend will let agencies buy cloud services “by the drink,” or under a consumption-based model. It will let GSA on-ramp new cloud service providers as they become available, as well as let contract holders bring innovation to the federal sector as required and necessary.

“We have flexibilities and ability that we haven’t exercised at scale before,” Hashmi said. “The other thing for me is creating a marketplace that is competitive. It can’t just be a small number of highly capable cloud companies. If you don’t create continuous opportunities for new companies to join the marketplace, then we have failed because this market is changing very rapidly.”

GSA has tried similar cloud acquisition vehicles previously for email-as-a-service and IaaS back in the early days of cloud buying. It found agencies didn’t want just a contract to buy cloud services, but wanted the full range of support from the cloud itself to integration services to ongoing support.

Gray areas still need to be worked out

Nearly 10 years later, with agencies having spent about $8.6 billion on cloud services last year (GSA acquisition vehicles accounted for $1.6 billion in 2020 and almost as much in 2021), Hashmi believes the time is right for this new approach.

“I think too many people talk about cloud as a thing that is different, unique and we need to buy it specially. Cloud is just part of how we modernize and how we deploy technology. So yes, you can use the Alliant vehicle to modernize, and the primary focus would be to buy professional services to help you modernize your solutions and technologies. Cloud is going to be a component of that,” he said. “Similarly, when you want to just buy a discrete number of licenses, it’s easy and fairly straightforward for buyers to use the schedule contract. But there’s a huge opportunity and challenge right now between those two extremes. There are more and more agencies who are going to multi-cloud environments. These require agencies to think about how these cloud technologies work with each other and integrate with each other, and then how to secure them. There are specialized services, and there are also highly complex licensing models that are becoming challenging for agencies to procure through a straightforward vehicle like the schedule. So Ascend is designed to solve for that problem.”

He added FAS also recognizes that Ascend may overlap or even have some gray areas with Alliant 2 or the schedule contract.

“We do think that if we hadn’t heard, loud and clear, from our customer base about the need to have a flexible, agile way to engage with cloud providers, we would not be pursuing down this road,” he said. “But we’ve seen that over and over again. We’re seeing a lot of agency-level BPAs, which I think adds complexity and frustration to the industrial base. No company wants to bet on a Department of Commerce BPA, a Department of Homeland Security BPA and a Defense Department BPA. We do want to make sure that we also reduce friction and burden for the industry, and this is one way to do it.”

Early versions of Ascend also have come under scrutiny by industry associations. The Coalition for Government Procurement wrote a letter to GSA last fall expressing concerns that another cloud BPA will be duplicative and wasteful.

Hashmi said these concerns and other questions that have come up over the last six months are why GSA is putting out the draft statement of work. The feedback from large and small companies, agencies, associations and anyone else is critical to Ascend’s success.

FEMA sets self-imposed deadline for moving more applications to the cloud (Thu, 26 May 2022)

For FEMA, cloud services are a lifeline to disaster survivors.

There may be no better use cases than when a hurricane or tornado strikes and FEMA must scale up its grants management or flood insurance program to tens of thousands of users in a matter of hours.

Lytwaive Hutchinson, the outgoing FEMA chief information officer, said the scalability and flexibility of cloud services, along with the innovation from providers, mean the agency can continually adapt to the changing needs of citizens.

“Our goal is to, by the end of this year, have at least 50% of all of our systems and services that are cloud ready to be moved into the cloud,” Hutchinson said during a recent panel sponsored by ACT-IAC, an excerpt of which was part of the Ask the CIO program. “I’ve had conversations with some vendors and some folks about lift-and-shift; lift-and-shift is my last resort. That is not something that’s viable. My first look is to take capabilities and actually either modernize them and/or move them into the cloud because they are cloud ready, or if they are not, then they should remain on premise.”

Hutchinson, who announced in March that she is retiring from federal service after 41 years, said some systems are better suited to remain on FEMA or Homeland Security Department data centers, while other systems are ready today or could be ready in the short term.

Lytwaive Hutchinson is the CIO at FEMA until she retires in the coming weeks.

“We do have upwards of, I think, 53 systems that are cloud ready so that will be 50% of 53 for this fiscal year. We have another set of systems that are not cloud ready and will have to go through a modernization phase,” she said. “Our goal is by fiscal 2026 to have all of our systems and services in the cloud. That is inclusive of our financial systems. We will address each of our systems on a case-by-case basis.”

She said this IT modernization initiative must be part of how FEMA does business every day and responds to every disaster. This means the services must be less about the greatest and latest technology and more about ensuring citizens have access to FEMA’s services whether they have internet connections or not.

“Our goal is to ensure that our services do not become obsolete by just building on to current technology, but by embracing new technology as that technology avails itself,” Hutchinson said. “You also heard us talk a little bit about our theme for this year, which is delivering digital equities. I know it’s a really nice little catchphrase, but it really does mean something to us. It is about delivering equity to our IT partners and to our citizens to be able to access this data, not just access it, but access it securely. We also want to make sure that we are taking care of our disabled community, and that we’re ensuring that our systems, our services, our websites are ready for them to also be able to utilize. We have a lot going on across FEMA as it relates to systems and services that we would like to deliver to our partners and to our citizens to be able to take advantage of the capability that FEMA brings to bear, especially during a time of need in a disaster.”

Securing software earlier on

One way FEMA is taking on this challenge is through a “secure by design” approach to developing new services.

Greg Edwards, the FEMA chief information security officer, said this is how the agency brings security closer to the acquisition process, so it addresses potential and real vulnerabilities on the front end of the development phase.

“We spent a lot of time in terms of zero trust with our users and thinking about how they access our services and devices in a protected and a secure manner. In that area, we’ve made some improvements in terms of how we control our mobile devices and made some modernization in the network and in the applications,” Edwards said on the panel. “In terms of our network, we’ve done a heck of a lot of modernization of the assets themselves. That’s all about our journey to our FEMA enterprise cloud. Then there is the data from a cyber perspective, where we are focusing very heavily on data being encrypted at-rest, and also data being encrypted in-transit.”

The move to the cloud and the focus on zero trust are forcing FEMA to rethink more than just its internal protections, but also how the public must access the data and applications.

Edwards said this is where the secure-by-design framework comes in.

“What that is going to allow us to do is closely align our system development lifecycle with the acquisition lifecycle. So step-by-step, we’ll be looking at cyber activities from when you’re doing some software development to when you’re doing some critical design testing to when you’re implementing to when you are decommissioning the system,” he said. “We think this framework, secure-by-design, will be helpful to govern our overall processes and help us tighten the reins in that area.”

Through the secure-by-design approach, Edwards said FEMA is fixing vulnerabilities faster, reducing the cost of security and improving collaboration between the technology and mission areas of the agency.

Getting governance right

The biggest impact of secure-by-design, however, may be in how the system operates to serve the mission and citizens.

Edwards said by looking at problems more holistically, FEMA can ensure changes or updates don’t have downstream effects that may make one element less secure or more complex to use.

“We’re at the governance point still, and then we want to communicate the governance framework to our governance board so we get the buy-in from the whole community about the concept and methodology. We want them to have a good understanding of it before we start saying that we’re actually implementing anything in that regard,” he said. “But in our business, we’re always working in parallel. We’ll be partnering with our major programs so we do some prototyping, some understanding of some of the impacts of actually implementing this, and getting to a goal of ongoing authorization and things of that nature. While we work on the governance, we’re also working with programs to prototype how this would actually work. By the end of this year, we would expect to have our governance process solidly in place, and my boss has asked me to make sure that I have about three processes that we’ve fully implemented by the end of this year as well.”

Edwards said there are nine processes within secure-by-design, and FEMA is looking at three of them, ranging from security planning to auditing.

BYOD, app consolidation next for Army digital transformation (Tue, 24 May 2022)

CAMBRIDGE, Md. — The Army’s digital transformation strategy was initially about getting the service to a unity of vision.

Seven months after releasing the strategy, it’s delivering new capabilities to warfighters.

Raj Iyer, the Army’s chief information officer, said the three separate implementation plans around cloud, data and unifying the network are all bearing fruit to improve warfighter effectiveness and capabilities.

Raj Iyer is the Army’s chief information officer. (Photo courtesy Army CIO’s office)

“These three efforts are now being tracked with quantifiable metrics and milestones to see what progress we’re making. I’m happy to report good progress all around on all three efforts,” Iyer said after his speech at the ACT-IAC Emerging Technology and Innovation Conference. “We’re probably, in my opinion, about halfway there with the goals that we set out …”

One of the foundational elements of the Army’s digital transformation effort is the move to Microsoft Office 365.

Iyer said almost a million soldiers and civilians have moved into the Office 365 environment, which includes email, Teams and other collaboration capabilities.

“I think where we’re headed now is really moving into SharePoint Online. That will enable us to sunset probably about 200-plus instances of SharePoint across the Army,” he said. “That is just a foundation because we are now looking at how we can leverage tools like Power BI and Power Apps that come with the O365 environment. We’re integrating video conferencing into our conference rooms, leveraging O365, so that way we can divest of some of the old video teleconferencing (VTC) equipment that we have in our conference rooms. We’re integrating voice capability into O365 so you get a number that goes with your O365 account.”

Many of these capabilities will come to the Army a little at a time. Iyer said the service is using an agile development approach, starting many times with the National Guard and Reserve.

“We felt the National Guard and Reserve didn’t have the right technologies in place to be able to work in a remote work environment. Before COVID, they were able to go to an armory or to a reserve center to be able to check emails and do certain things. Post COVID, it became much harder for them to actually go to an office location to do things. So we’re really focusing on improving the user experience for our reserve and guard,” he said. “Every one of these efforts, whether it’s bring-your-own-device (BYOD) or whether it’s virtual desktop infrastructure, we’re starting with the National Guard and Reserve first, and essentially, that will be the first rollout. We’ll start to roll it out across other parts of the Army, and over the next 12-to-18 months you’re going to see all of these initiatives fully implemented across the Army.”

Iyer is especially excited about the BYOD initiative.

After a successful pilot with the National Guard and Reserve in 2021, Iyer said he is ready to implement the BYOD technology to about 20,000 users this year — 10,000 in the guard and reserve and 10,000 across the rest of the Army.

“We’re trying to make this as global as we can to start with because you want a good idea as to you know how well it works across a broad cross section of our users around the world,” he said. “We are just finishing out some of the early testing. I expect to start onboarding users probably in the next 60 days or so.”

The challenge for the Army was finding the right technology that would be secure enough, but also respect the privacy of soldiers and civilians.

Iyer said through a service-disabled veteran-owned small business, the Army is using a BYOD technology that can work securely on “unmanaged” devices.

“We found a company that was in the early stages of prototyping, but they had an implementation at Special Operations Command (SOCOM). It was, however, in a much smaller, narrower context than what the Army wants to do at scale,” he said. “But we worked with the vendor to get them through the cybersecurity processes, and we made sure that we did full vulnerability testing and pen testing on it. It came back with flying colors, and so we’re now on a path to get that implemented across the Army.”

]]>
This tiny vendor got a big Air Force contract, and defied the odds https://federalnewsnetwork.com/air-force/2022/05/this-tiny-vendor-got-a-big-air-force-contract-and-defied-the-odds/ Mon, 02 May 2022 18:10:59 +0000

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

The company is called Ditto. Not a household name in defense contracting. But it just got an indefinite delivery, indefinite quantity contract from the Air Force, worth a potential billion dollars. Joining the Federal Drive with Tom Temin to discuss the Air Force requirement it’s hoping to meet: Ditto CEO and co-founder Adam Fish.

Interview transcript:

Tom Temin: Mr. Fish, good to have you on.

Adam Fish: Thank you for having me.

Tom Temin: Tell us about the problem the Air Force came to you with, and what it awarded you this contract for.

Adam Fish: We’re a young technology startup based in California. And the larger aspect to this contract really goes back to the JADC2 efforts, where the DoD, including the Air Force, is really trying to re-imagine the entire network system behind all their military weapons systems. And that’s at the heart of what Ditto does. Ditto is a software platform that enables devices to connect and share data in real time, even in very austere or degraded environments, which is exactly, you know, what the military is operating in.

Tom Temin: So it sounds like the need is to connect, say, centralized systems or even cloud systems with edge computing, the thing you hear about so much, which might be the cube computers or the self-contained data centers, the many form factors that go with military units, as you say, into those austere environments?

Adam Fish: Yes, absolutely. And this has become an ever more important aspect as the DoD has shifted its focus toward near-peer competitors, where not only do you have to worry about the austere conditions that the military is operating in, but you also have to worry about having your communication systems broken or jammed. And that’s something that is a real possibility when thinking about near-peer competitors. So as a result, they really want to make sure that their networks are resilient to those problems. And the way to do this is actually to go back to the origins of the internet, where it was very much a peer-to-peer system: ARPANET, MILNET, those early versions of even the commercial internet were all based on a peer-to-peer model, where devices can operate semi-independently. Unfortunately, in the modern era, cloud computing has inverted that. Things have become centralized, where you have a central server, and if that breaks, devices can’t talk to each other. And that’s not going to work for the military in these environments. So they want to get back to a system that’s resilient and can work, whether it’s at the edge or in the cloud, in a more peer-to-peer, mesh manner. And our software platform is something that helps enable applications to do that.

Tom Temin: Yeah, that’s always been the tension over the history of computing: the central server with small, thin clients versus peer-to-peer, where the computing is distributed. That’s the central issue here?

Adam Fish: Exactly. And it’s funny how, when you look back over the course of computing history, this has come and gone in waves. It started out as a very centralized system with mainframe computers; people would log in, and it was actually kind of like a cloud computer, where you would rent time on the mainframe. But then personal computers led it to become distributed again. And then cloud computing went back to the mainframe model. What we’re seeing, though, and this is in the military but even outside of it, is that with mobile devices and edge-type computing systems, you now have the processing power to do things at the edge versus having to go back to that central system. So it’s really a confluence of several factors: the computing systems operating at the edge can actually do more today. And as a result, the military wants to take advantage of that.

Tom Temin: We were speaking with Adam Fish; he’s CEO and co-founder of Ditto, which just got a very large Air Force contract. And just briefly, what is the technological means by which you can make sure that data is synchronized from central locations, through an austere or interrupted environment, to the edge computers, when in fact the connections may not be there?

Adam Fish: Great question. The core premise behind Ditto is that today, if you go into your mobile phone and you open up whatever your favorite app is, that application is going to talk to a central computer to get all the data that it needs. So even if you send a message and you say, hey, I want to text a friend, it’s going to first go through a central server to get to that other person’s device. And that central system is a point of failure. That’s how we think about it from a software architecture perspective. And having points of failure in your system is risky when lives are on the line. So that is what the military is trying to solve: how can we take the points of failure out of our data communication links and create multiple ways that data can move? The way that works is that devices need to be able to create arbitrary connections with each other. Take two mobile devices: if you were standing next to each other, why can’t one device just transmit the data to the other using something like Wi-Fi, Bluetooth or a local radio, versus having to use the cellular connection to go back to a server to send the data? That’s the problem the military has, in a big way, because they have so many different weapons systems with different radios. They have the capability to transmit data in different directions, but right now those things don’t really talk to each other. And so the JADC2 efforts broadly across the DoD, and ABMS for the Air Force, which is what we’re involved with, are a way to rethink how all those weapons systems talk to each other, so they have resilient paths for transmitting data so that it doesn’t get blocked.
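The architecture Fish describes, in which peers exchange state directly over whatever link is available rather than through a central server, can be sketched as a last-writer-wins replicated store. This is a minimal illustrative sketch, not Ditto's actual product or API; the `PeerStore` class and its merge rule are hypothetical stand-ins for the general technique.

```python
import time

class PeerStore:
    """Toy offline-first key/value store. Each peer tags writes with a
    timestamp and can merge with any reachable peer -- no central server."""

    def __init__(self):
        self.data = {}  # key -> (value, timestamp)

    def put(self, key, value, ts=None):
        self.data[key] = (value, time.time() if ts is None else ts)

    def merge(self, other):
        """Last-writer-wins merge. Merging is commutative and idempotent,
        so any gossip path (Wi-Fi, Bluetooth, local radio) converges."""
        for key, (value, ts) in other.data.items():
            if key not in self.data or ts > self.data[key][1]:
                self.data[key] = (value, ts)

# Two peers write independently while the network is down or jammed...
a, b = PeerStore(), PeerStore()
a.put("target", "grid-7", ts=1)
b.put("target", "grid-9", ts=2)

# ...then sync peer-to-peer when any direct link comes up.
a.merge(b)
b.merge(a)
assert a.data == b.data  # both converge to the newer write
```

A production system needs trustworthy or logical clocks and finer-grained conflict resolution, but the shape of the idea, state that converges regardless of which links happened to be available, is the same.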

Tom Temin: And you can always fly a carrier pigeon holding a thumb drive, I guess, in the last analysis, if you have to.

Adam Fish: Exactly. It’s a real thing. Unfortunately, in a lot of situations, movement of data could actually happen in a more physical manner, where people are carrying thumb drives or hard drives. That does happen. And it’s something to think about: that might be the most relevant or safe path to move data in certain situations. But at the end of the day, the software systems need to be able to handle data moving over multiple different paths. And that’s a very different approach to building software than one that always talks to a central server.

Tom Temin: And this is an IDIQ contract, which generally is for goods. So how does that work? Are they buying you copy by copy, depending on the unit that needs it? That is to say, are you offering a cloud-based service? Apparently not; it sounds like something that is replicated locally, as traditional software licenses.

Adam Fish: Yeah, it’s actually a little bit of both. At the end of the day, there is a need for cloud systems; it’s not that everything can work on the edge, and no one would want that. There are benefits of both: you need the ability for data capture at edge locations, and those systems need to be able to talk to each other independently. But then that data needs to get back to big, powerful central systems so you can do analysis on it. And that really would be the ideal situation, especially for ISR-type missions, where you might be out flying missions capturing tons of information at the edge. There might be some processing that happens there in real time, but then there needs to be the ability to move that data back to big systems to analyze it, and then push those insights back out. That whole loop of data capture to data analysis, and then action off of that, is commonly thought of as the chain that the DoD is trying to shorten, so that they can act as quickly as possible. And that really captures the larger JADC2 effort. Our software has applicability across all of that. We have edge software that would be licensed, but there is also a cloud component for the analysis part as well.
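The capture-process-forward loop Fish describes, where edge nodes keep working through an outage and push their data back for central analysis once a link returns, is essentially store-and-forward. A minimal sketch, with hypothetical names and a made-up "flagging" step standing in for real edge processing:

```python
from collections import deque

class EdgeNode:
    """Toy store-and-forward node: process readings locally in real time,
    queue the results, and drain the queue only when a link is available."""

    def __init__(self):
        self.backlog = deque()

    def capture(self, reading):
        # Edge-side real-time processing (here, a trivial threshold check).
        processed = {"reading": reading, "flagged": reading > 100}
        self.backlog.append(processed)
        return processed

    def drain(self, link_up, upload):
        """Forward queued results when connectivity exists; otherwise keep them."""
        sent = 0
        while link_up and self.backlog:
            upload(self.backlog.popleft())
            sent += 1
        return sent

node = EdgeNode()
for r in (42, 150, 7):          # capture works with no connectivity at all
    node.capture(r)

cloud = []
node.drain(link_up=False, upload=cloud.append)  # jammed: nothing moves
node.drain(link_up=True, upload=cloud.append)   # reconnected: backlog drains in order
```

The design choice worth noticing is that capture never blocks on the link: the queue decouples the real-time edge step from the back-haul, which is exactly what makes the system usable in a degraded environment.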

Tom Temin: So the IDIQ aspect, then is that different pieces of the Air Force have the opportunity to take advantage of the contract?

Adam Fish: Correct. Yeah. And it could be used for different purposes, different use cases, but broadly covering any of our software. And there is definitely a variety of use cases that we see. For example, one would be involved with the Agile Combat Employment effort, ACE, where there is a lot of need to be able to quickly stand up a remote base, perhaps in a remote area where everything has to be flown in and carried in. You basically have to create everything very quickly, and as a result, you need software that’s very adaptable to that situation. That’s where our edge software could really help, in terms of the data capture at that remote location, but then it could be transferred back to a cloud back end for analysis.

Tom Temin: Not many companies with 30 people get billion-dollar potential IDIQs through the Federal Acquisition Regulation or the DFARS. A lot of them are getting OTA deals, but this sounds like a DFARS-based deal. How did you even know about the Air Force?

Adam Fish: Yeah, it’s actually a bit surreal. Speaking personally, when we got started, there is definitely a view in Silicon Valley of, hey, is it a good idea to work with the government? Things can take a long time working with the government. It’s a little obscure, and part of our team was a little skeptical of it just because of that history. But this was something that was really exciting for us, and we felt it was absolutely necessary that we find a way into the military, given the nature of our product and how much impact it can have. And so we actually got started with the Air Force in a program called AFWERX. It’s a small business innovation program that is designed to connect commercial companies like Ditto and bring them into the government. That was a great pathway for us in, and it’s continued to snowball to where we are today. But it’s definitely been a very exciting ride.

Tom Temin: So in many ways, the government was one of your seed contracts to really get revenue into the company.

Adam Fish: Correct. Yeah. And that took a concerted effort; we wanted to make that a priority. And that’s something that I’m very proud of, because I think the impact, the potential here, is really significant. And I think, more broadly, more technology companies need to be paying attention. We have big challenges ahead of us as a country in thinking about Russia and China and others, and technology is going to be a key player in that. Right now, the best technology is not being built inside the government. It’s being built in commercial companies. And I’m excited that Ditto is able to take our technology and have an impact.

Tom Temin: Adam Fish is CEO and co-founder of Ditto. Thanks so much for joining me.

Adam Fish: Thank you.

]]>
NSA quietly re-awarded its Wild and Stormy cloud contract https://federalnewsnetwork.com/reporters-notebook-jason-miller/2022/04/nsa-quietly-reawarded-its-wild-and-stormy-cloud-contract/ Tue, 26 Apr 2022 19:06:59 +0000

The National Security Agency’s Wild and Stormy cloud procurement continues to live up to its name.

Four months after NSA lost what many believe to be the first-ever protest of one of its contract awards at the Government Accountability Office, it re-awarded the 10-year cloud contract, known by that distinctive moniker, which could be worth as much as $10 billion, to Amazon Web Services.

The spy agency made the re-award in February, but details just surfaced in the last week.

An NSA spokesperson confirmed the agency’s decision.

“This contract is a continuation of NSA’s Hybrid Compute Initiative to modernize and address the robust processing and analytical requirements of the agency,” the spokesperson wrote in an email to Federal News Network. “Consistent with the decision in [the GAO protest] case, the agency has reevaluated the proposals and made a new best value decision.”

Sources also confirmed that Microsoft, which won the initial protest at GAO in October, decided not to protest the re-award to AWS, despite what many believe is a tilted playing field.

A source, who requested anonymity in order to speak to the press, said a new protest would’ve just delayed the process, which would be detrimental to NSA and possibly national security.

But, the source added, NSA’s decision does raise concerns about another single-award contract for cloud services, in this case for classified and top secret instances. Experts continue to question NSA’s decision, especially after the controversial JEDI acquisition collapsed under immense pressure and scrutiny of its single-award plan, and after the intelligence community moved from a single vendor — AWS — under the C2S vehicle to multiple cloud vendors under the C2E vehicle.

Additionally, sources highlight that NSA’s decision again offers at least the perception of special treatment for AWS. Sources say that under the C2S contract, NSA and its intelligence community partners supported the development of AWS’s secret cloud instance while other cloud service providers received no financial or other benefit.

A 15-month acquisition saga

As for Wild and Stormy, NSA issued the solicitation in November 2020 and made the award to AWS in July under a two-phase, best-value tradeoff approach.

AWS and Microsoft advanced to phase 2. NSA rated AWS’s proposal higher and found it offered more value than Microsoft’s, despite a base price of $482 million compared to Microsoft’s $422 million, according to GAO’s bid protest decision.

Microsoft filed a protest on July 21, claiming NSA misevaluated proposals under the technical factor, under the management factor and on total price. Microsoft claimed that “the agency’s best-value selection decision was improper, and that NSA failed to meaningfully consider Microsoft’s lower price as part of the price/technical tradeoff.”

GAO sustained Microsoft’s protest and recommended “NSA reevaluate technical proposals, consistent with this decision, and based on that reevaluation, perform a best value tradeoff and make a new source selection decision.”

NSA declined to offer any more details about how it reevaluated the proposals and how it came to the new award decision.

Joe Petrillo, an attorney with Smith Pachter McWhorter, told the Federal Drive with Tom Temin in December that GAO’s recommendation didn’t require NSA to reopen discussions or let Microsoft and AWS revise their bids.

“It’s up to NSA to decide how to implement this. They may have valid reasons for wanting to reopen and reevaluate the proposals,” Petrillo said. “One of the issues, interestingly enough, that wasn’t successful, although GAO did note NSA should take it into account, was a question about how the evaluated prices were developed and how they were evaluated. They consisted of three sample task orders, and then prices for five different benchmarks. Those were all totaled, although it seemed that the benchmark prices, which were very small in comparison to the task order prices, would in actual performance constitute much more of the total price. Somehow the evaluation system didn’t take that into account. And NSA might want to fix that, but that would probably require a new round of proposals.”
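Petrillo's weighting point is easy to see with hypothetical numbers (the figures below are invented for illustration and come from neither the protest record nor NSA): when small benchmark unit prices are simply added to large sample task-order prices, the offer that looks cheaper "as evaluated" can be the more expensive one once benchmark-priced services are consumed at realistic volume.

```python
# Hypothetical offers: sample task-order totals in $M, plus a benchmark
# unit price in $M per unit of service consumed.
offers = {
    "X": {"task_orders": 400.0, "benchmark_unit": 0.010},
    "Y": {"task_orders": 380.0, "benchmark_unit": 0.025},
}

def evaluated_total(o):
    # As described in the protest: task-order and benchmark prices simply
    # totaled, so the tiny benchmark figure barely registers.
    return o["task_orders"] + o["benchmark_unit"]

def projected_total(o, benchmark_units=40_000):
    # As performed: benchmark-priced services dominate actual spend.
    return o["task_orders"] + o["benchmark_unit"] * benchmark_units

# Y wins on the evaluated total, but X wins once consumption is modeled.
assert evaluated_total(offers["Y"]) < evaluated_total(offers["X"])
assert projected_total(offers["X"]) < projected_total(offers["Y"])
```

The assumed 40,000-unit consumption is the crux: any evaluation that totals unit prices without weighting them by expected volume rewards the vendor who discounts the wrong line items.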

What exactly NSA did this second time around may be known only by a handful of people involved in the procurement, but given the lessons learned from JEDI and C2S and the broad move to multi-cloud in the public and private sectors, the single award is perplexing. It may make perfect sense to NSA now, but it’s hard to imagine that locking any organization into even one top secret cloud offering, when others are obviously available, is a smart decision over the long term.

]]>
DISA’s milCloud replacement is open for business https://federalnewsnetwork.com/ask-the-cio/2022/04/disas-milcloud-replacement-is-open-for-business/ Fri, 22 Apr 2022 16:06:31 +0000

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Ask the CIO on Apple Podcasts or PodcastOne.

When the Defense Information Systems Agency decided to end its milCloud offering, it didn’t mean the end of on-premise cloud options for its Defense customers.

DISA is replacing that long-time, possibly underutilized offering with something new called Stratus.

Sharon Woods, the director of the Hosting and Compute Center at DISA, said Stratus takes the best of milCloud and improves on it to help the military services and defense agencies meet their ever-changing IT modernization needs.

Sharon Woods is the director of the Hosting and Compute Center at DISA.

“It’s its own offering in its entirety. The idea with any kind of on-premise cloud capability is that you want it to mirror commercial cloud as much as you can. You want it to be elastic. You want it to be automated. You want it to be self-service, and self-provisioning. I think the self-service component gives control to mission owners so that they can go in there and very quickly spin something up and spin something down. Everyone associates that with commercial cloud,” Woods said on Ask the CIO. “The idea with an on-premise cloud is to replicate those characteristics as much as you possibly can, except that the servers are in our data centers because some applications are not ready to operate in commercial cloud. Stratus is this nice in-between step where they can get their applications and workloads more virtualized and operating in a way that can actually consume and use that technology where it’s not so tied to the hardware, which often is what happens now: this application only works if you have this very specific piece of hardware.”

DISA decided to end the milCloud initiative in December after Lt. Gen. Robert Skinner, the director, decided it no longer made financial or operational sense. Users of milCloud 2.0 and 1.0 must move off of the platform by May.

DISA awarded a contract to CSRA in June 2017 to develop and run the commercial cloud offering. GDIT bought CSRA in April 2018 for $9.7 billion.

The milCloud 2.0 contract included a three-year base with five one-year options, and it was worth as much as $498 million. This June would have been the third option period for the program.

Best value for hybrid cloud

Woods said Stratus will help DISA customers improve how they manage data, particularly around the cost of moving data between on-premise and commercial clouds.

“Stratus lets you say, ‘OK, this is the dedicated hardware for you, you’re going to put your data here so that you know how much it costs and then you will do your transactions accordingly.’ There’s a number of use cases where Stratus makes a lot of sense as mission owners get smarter and smarter about working in commercial cloud,” she said. “We’re focused on delivering a best value capability. It needs to make sense in terms of how the requirements are met. It needs to make sense in terms of the price. And if it doesn’t, then it needs to be sunset, and Stratus is no exception. We’ll certainly be managing it and watching it closely. But I do think a hybrid cloud capability is a requirement that exists now and will for a while. And so we have to deliver something and right now Stratus is the capability that we think is best value.”

Stratus is already operational, has received its authority to operate (ATO) at the unclassified, secret and classified levels, and is open for use by DoD customers.

Woods said DISA is making Stratus as self-service as users want it to be, meaning they can ask for help or simply transfer funds and stand up a virtual machine instance on their own.

As for milCloud, Woods said all users must be out of the platform by May 20.

“We are involved with every single mission partner that is in the milCloud 2 environment to help them get to whatever target environment they want to get to. It’s all about being an honest broker. We did not push them to go in any particular place. I’d love to see them go to Stratus, but some folks were ready to go to commercial cloud. And we absolutely had a number of mission partners go to commercial cloud, or they are going to commercial cloud instead of Stratus,” she said. “Anything and everything that mission partners need to get out of the environment, we are there a phone call away. We’re trying to be really aggressive about making sure we’re providing the support and not just hanging back and waiting to see if there’s a problem.”

Two other cloud services

While Stratus is its latest initiative, the HACC also has been pursuing several other cloud-related efforts, including infrastructure-as-code and containers-as-a-service offerings.

Woods said these and other pilots are part of how the HACC is creating hybrid capabilities that help customers modernize applications and take advantage of commercial-like technologies.

In DISA’s strategic plan released in December, the HACC is responsible for 10 lines of effort under the plan’s five broad goals. The HACC released its own action plan to meet those goals.

Woods said a few of those goals focus on automation to enable and accelerate the use of cloud services, whether on-premise or in the commercial sector.

The containers-as-a-service and infrastructure-as-code offerings are two examples of how the HACC is doing that.

Woods called containers-as-a-service a “bellwether” for the HACC’s entire cloud strategy.

“I’m extremely excited about it and I’m really proud of the team, because one of my mantras is that we need these microsuccesses where you’re delivering minimum viable products from ideation to the delivery of an initial prototype in six months or less. That’s not necessarily something that people are used to seeing within the federal government,” she said. “But that’s what the HACC is going to be doing. Containers-as-a-service is an example where that is, in fact, what happened. The premise, and why I say it’s a bellwether for the HACC strategy, is that it’s taking Kubernetes, OpenShift in particular, and rather than deploying it in the cloud, it’s deploying it in the traditional data centers.”

DISA’s infrastructure-as-code offering

She said DISA customers are deploying web servers as part of the pilot using virtual machines that come secure and ready to be used.

“By using a web server that’s containerized, we can take the container and can make sure that it’s configured as well as it can be. That becomes the thing that you deploy, and if a mission partner has a presence in the cloud as well, you’ve created a situation where now the technologies are standard and they’re able to communicate across each other from data center to commercial cloud,” she said. “We’ve given them a capability that gives them a really awesome jumpstart to integrate with commercial cloud services. That started in November as an idea and we’ve already delivered a prototype. We’re working right now to implement it with our first customer.”
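The pattern Woods describes, a hardened, containerized web server that deploys identically in a data center or in commercial cloud, is what a standard Kubernetes manifest expresses. The sketch below is purely illustrative: the image name, registry, labels and ports are assumptions, not DISA’s actual OpenShift artifacts, which are not public.

```yaml
# Illustrative Kubernetes manifest: a pre-hardened web server image deployed
# as a Deployment plus Service. All names and the registry are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-server
  labels:
    app: web-server
spec:
  replicas: 2
  selector:
    matchLabels:
      app: web-server
  template:
    metadata:
      labels:
        app: web-server
    spec:
      containers:
      - name: web
        image: registry.example.mil/hardened-web:1.0  # hypothetical pre-configured, security-baselined image
        ports:
        - containerPort: 8080
        securityContext:
          runAsNonRoot: true
          readOnlyRootFilesystem: true
---
apiVersion: v1
kind: Service
metadata:
  name: web-server
spec:
  selector:
    app: web-server
  ports:
  - port: 80
    targetPort: 8080
```

Because the same manifest applies to an OpenShift cluster in a data center or a managed Kubernetes service in commercial cloud, the deployment artifact, not the underlying hardware, becomes the standard unit that moves between environments.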

The infrastructure-as-code effort is a bit further ahead from an operational standpoint.

The Army Corps of Engineers already tested it out, with the HACC standing up an application environment in a few hours instead of the 38 weeks it may have taken previously.

“It was automated. It was validated. You start removing some of that human error component as well, which just improves security and improves speed to mission and all these things,” Woods said. “That’s what infrastructure-as-code is: these automated, pre-configured cloud environments with privileged identity and continuous monitoring security policies around it. We have well over a dozen customers, so it is one of the success stories because we took it from ideation to delivery in less than six months. We’ve had well over a dozen different customers consume it both in a research capacity as well as production.”
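The approach Woods describes, declaring an entire environment in version-controlled text that is validated and applied automatically, is commonly expressed in a declarative tool such as Terraform. The fragment below is a generic sketch of that idea, not DISA’s actual templates; the provider, region, resource names and address ranges are all assumptions.

```hcl
# Generic infrastructure-as-code sketch: the environment is declared once
# and reproduced identically on every apply, with no manual console steps.
# Provider, region, names and CIDR ranges are placeholders.
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region = "us-gov-west-1" # placeholder GovCloud region
}

# Network boundary for a hypothetical mission environment
resource "aws_vpc" "mission_env" {
  cidr_block = "10.0.0.0/16"
  tags = {
    Environment = "pilot"
    ManagedBy   = "terraform"
  }
}

resource "aws_subnet" "app" {
  vpc_id     = aws_vpc.mission_env.id
  cidr_block = "10.0.1.0/24"
}
```

Running `terraform plan` shows every change before it is made and `terraform apply` builds the declared environment in minutes, which is the hours-instead-of-weeks effect the Corps of Engineers pilot illustrates.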
