IARPA to offer potential cure for employees’ ‘linkclickitis’ disease (Feb. 25, 2019)

The intelligence community, like every other federal and private sector organization, suffers from the common employee disease of “linkclickitis.”

It’s described by doctors as a condition where the employee has an uncontrollable urge to press the left button on the mouse while hovering over a link sent by email.

But the good doctors, er, developers, from the Intelligence Advanced Research Projects Agency (IARPA) may have found a cure, or at least a way to isolate the disease so it doesn’t harm the rest of the body.

Kerry Long, a program manager at IARPA, said his team is in final testing of an approach, called the Virtuous User Environment (VirtUE), that puts email, applications, data and other key functions in separate cloud containers. About two years ago, IARPA released a broad agency announcement asking for help in solving the spear phishing plague that was, as Long described it, “eating our lunch” across the IC.

Kerry Long, program manager, IARPA, Office of the Director of National Intelligence

“In VirtUE, we created containers that take on different user roles, which could be browsing email or working on a document. Each of those resources has different risk profiles,” Long said at FCW’s cloud security workshop on Feb. 21 in Washington, D.C. “So browsing your email is the riskiest thing you are doing and second is going to the public internet, and in the environment we give you today, we combine all those risks and then it surprises and shocks us when bad things happen. The design of today’s environment is basically encouraging that. In many ways, that is what the VirtUE program was designed to combat.”

He said each function gets its own container with all the protections needed for that role, and those containers, which sit in a cloud instance, can be shared and traded as needed.

“Imagine a user has five or six roles that they do during the day and there are five or six virtual machines running in the cloud all separate, doing these different roles so they are isolated from each other. But your interface hides that from you and when you are doing the role, it’s coming back to you in real time,” Long said in describing how VirtUE could work. “When an adversary breaks in now to one of your roles, [he] gets really frustrated because there is nothing else there but email. There is no connectivity that he can ride between the roles.”

By limiting each role to its own container, IARPA and its industry partners are tackling one of the most challenging cybersecurity problems: stopping an attacker from hopping across the network once he gets through the initial set of cyber defenses.
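
To make the isolation pattern concrete, here is a minimal sketch of one container per role using the Docker SDK for Python. The role names, container images and network layout are invented for illustration; they are not details of IARPA’s actual implementation.

```python
# Sketch: one isolated container per user role, in the spirit of VirtUE.
# Images, role names and network layout are hypothetical.
import docker

client = docker.from_env()

ROLES = {
    "email":    "virtue/email-client:latest",   # hypothetical image names
    "browsing": "virtue/web-browser:latest",
    "editing":  "virtue/doc-editor:latest",
}

for role, image in ROLES.items():
    # One internal-only bridge network per role: containers on it cannot
    # reach containers on the other roles' networks, so an intruder who
    # lands in "email" has no path to ride over to "editing."
    network = client.networks.create(f"virtue-{role}", driver="bridge", internal=True)
    container = client.containers.run(
        image,
        name=f"virtue-{role}",
        detach=True,
        network=network.name,
        read_only=True,  # read-only root filesystem narrows what an intruder can change
    )
    print(f"started role '{role}' as container {container.short_id}")
```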

Long said the idea came from the growth of containers, which have revolutionized the way developers work and use the cloud.

In less than a month, IARPA will release the VirtUE approach to the public for review and use.

“Now that VirtUE is nearly completed, it’s about 75 percent successful,” he said.

Long said the cloud is a big reason why VirtUE can help provide a cure to the link clicking disease.

“The cloud gives us the opportunity to reinvent anything we want. But [there’s a] lack of imagination, because the things I know about, the things we like, the things we are used to are what we all want to see in the cloud. It makes sense, but IARPA’s main mission is to challenge cloud innovation. We keep asking to reimagine what it could be in the cloud. [The cloud providers] can make anything they want in the cloud and they do. Maybe the government needs to be more innovative.”

The success of VirtUE is coming at a perfect time for agencies. Not only does spear phishing continue to plague federal departments, but more and more agencies are moving to the cloud.

Ashley Mahan, the director of the Federal Risk and Authorization Management Program (FedRAMP), said her organization has seen a 60 percent increase in the number of CFO Act agencies using the cloud over the last year.

She said the biggest trend is around agencies using software-as-a-service cloud offerings.

This is especially true under the FedRAMP Tailored approach, which lets low-risk cloud services get through the security process more quickly.

“In 2018, more SaaS has come through the Tailored program than any other year, and 25 percent of in-process SaaS offerings are going through the Tailored model,” Mahan said. “Over 20 agencies have expressed interest in working with a vendor through the Tailored model. We’ve had conversations with over 40 cloud service providers who are interested in bringing a product through Tailored. We have 11 cloud services that have achieved FedRAMP Tailored authorizations and we’ve had 10 agencies that have worked with them.”

Small agencies can be more nimble when modernizing IT, moving to the cloud (Feb. 22, 2019)

There are more than 75 small and micro agencies across the government, and they face many of the same IT modernization challenges as the large, well-known agencies.

These agencies, which range from a few people to a few hundred employees, see both advantages and disadvantages in being small or micro organizations.

On one hand, many small agencies don’t necessarily have the money or people to do full-scale IT modernization, forcing them to keep running legacy technology, which, of course, opens them up to cybersecurity risks.

On the other hand, small agencies can move more quickly with fewer approvals and, let’s say, lawyers mucking up the processes. And even a little money can go a long way to move to the cloud or upgrade cyber defenses.

One small agency CIO told me last year that his plan was to move to the cloud and stop spending on commodity technology.

At the same time, small and micro agencies don’t necessarily have the support, or important oversight, from the Office of Management and Budget or Congress because they are not spending tens or hundreds of millions of dollars on IT projects. That also is both a benefit and a hindrance to IT modernization.

Another small agency CIO said he was surprised after moving to a small agency from a large one because so many small or micro agencies fly below the radar when it comes to OMB policies and requirements.

But like large agencies, small ones still have to be able to take advantage of technologies like cloud, and emerging ones like machine learning or robotic process automation, to modernize their services and improve their cyber postures.

IT Modernization Efforts

We have about 60 percent of our infrastructure in the cloud and we now are actively working on the remaining 40 [percent]. The first piece in the cloud is when we moved all of our e-mail services to Office 365. The second major piece is when we moved our financial processing system to a software-as-a-service offering, so that was a large IT mass that now sits over, in our case, Oracle managed cloud services.

The Challenges of Small Agencies in IT Modernization

One of the big pushes we’ve seen for small, medium and large agencies is ensuring that when they are doing agreements there is a built-in schedule for training. As new services become available, especially for those clients who are operating in some sort of hybrid environment, it’s great to train everyone today. But what does that workforce look like in two years? As long as there is a continuing process of bringing training in, being able to take advantage of new services and incorporating that into the curriculum is the key.

Engagement with Smaller Agencies

One very good surprise that we had when we migrated to a new cloud provider is they had software-defined networking implemented in their network. We moved one of our applications to the cloud and we did it so smoothly that no one noticed.

Bureau of the Fiscal Service moving toward open ‘data lake’ powered by cloud (Feb. 11, 2019)

Amid a push to make government data more accessible, the Treasury Department’s Bureau of the Fiscal Service is moving away from structured data silos in favor of an open “data lake” approach.

“In order for us to get full utility of our data, we really needed to be able to create a platform where we can actually use all of it,” Tony Peralta, the agency’s data architect, said in a Jan. 31 interview following Veritas’ Public Sector Vision Day in Washington, D.C.

The data lake, an inventory of raw information, has played a crucial role in helping the bureau better assist the people who contact its call centers looking to settle delinquent debts.

Tony Peralta, data architect for the Treasury Department’s Bureau of the Fiscal Service

“One of the things that we’re looking to do with building a big data lake is basically the ability to take voice files, transcribe them, and be able to get sentiment analysis from that information,” Peralta said, both to help the agency meet its mission objectives and to meet the goal outlined in the President’s Management Agenda to use agency data as a strategic asset.

The bureau relies on the Workforce Community Cloud (WC2), a shared-service cloud architecture provided by the Treasury Department’s Office of the Chief Information Officer, to ingest the call audio, transcribe the text and run the sentiment analysis.
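
The shape of that pipeline is worth sketching. The fragment below uses AWS Transcribe and Comprehend purely as stand-ins, since the article does not say which services WC2 exposes, and the job name and S3 URI are invented.

```python
# Sketch of a transcribe-then-analyze call center pipeline. AWS Transcribe and
# Comprehend stand in for whatever services Treasury's WC2 actually exposes.
import json
import time
import urllib.request

import boto3

transcribe = boto3.client("transcribe")
comprehend = boto3.client("comprehend")

def call_sentiment(audio_s3_uri: str, job_name: str) -> dict:
    # 1. Transcribe the call recording to text.
    transcribe.start_transcription_job(
        TranscriptionJobName=job_name,
        Media={"MediaFileUri": audio_s3_uri},
        MediaFormat="wav",
        LanguageCode="en-US",
    )
    while True:
        job = transcribe.get_transcription_job(TranscriptionJobName=job_name)
        status = job["TranscriptionJob"]["TranscriptionJobStatus"]
        if status in ("COMPLETED", "FAILED"):
            break
        time.sleep(5)
    if status == "FAILED":
        raise RuntimeError(f"transcription job {job_name} failed")

    # 2. Pull the transcript text out of the JSON the job produced.
    uri = job["TranscriptionJob"]["Transcript"]["TranscriptFileUri"]
    with urllib.request.urlopen(uri) as resp:
        text = json.load(resp)["results"]["transcripts"][0]["transcript"]

    # 3. Score the sentiment of the call (Comprehend caps Text at 5,000 bytes).
    return comprehend.detect_sentiment(Text=text[:5000], LanguageCode="en")

# Example: call_sentiment("s3://example-bucket/call-0001.wav", "call-0001")
```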

Before moving to the cloud, the bureau faced constraints in conducting this level of analysis.

“The traditional enterprise data warehouse, more often than not, had analysts engaging data and pulling it down, doing their own scrubbing, then trying to run the model on a laptop or a workstation that is usually bound by its configuration. So you only had so much memory that you can execute. You only have so much hard-drive space that you can actually perform the analysis,” Peralta said. “The inefficiencies were very clear.”

The bureau’s cloud environment, which powers its data lake model, allows bureau employees to gather more data-driven insights.

“Now we have elastic compute. We can actually spin up clusters of our enterprise data warehouse footprint,” Peralta said. “We can build those things up to our needs and we can obviously tear them down. And being good stewards of taxpayer dollars, really maximizing the flexibility of cloud infrastructure and services.”
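
That elastic pattern is simple to express in code. Here is a sketch using Amazon EMR as a stand-in for the bureau’s warehouse clusters; the cluster name, instance types and sizes are invented for the example.

```python
# Sketch of the spin-up/tear-down pattern Peralta describes, with Amazon EMR
# standing in for the bureau's data warehouse clusters. Names are hypothetical.
import boto3

emr = boto3.client("emr")

def spin_up_cluster() -> str:
    resp = emr.run_job_flow(
        Name="fiscal-analysis",          # hypothetical cluster name
        ReleaseLabel="emr-6.15.0",
        Instances={
            "MasterInstanceType": "m5.xlarge",
            "SlaveInstanceType": "m5.xlarge",
            "InstanceCount": 4,          # sized to the job, not a fixed workstation
            "KeepJobFlowAliveWhenNoSteps": True,
        },
        JobFlowRole="EMR_EC2_DefaultRole",
        ServiceRole="EMR_DefaultRole",
    )
    return resp["JobFlowId"]

def tear_down_cluster(cluster_id: str) -> None:
    # Paying only while the analysis runs is the "good stewards" part.
    emr.terminate_job_flows(JobFlowIds=[cluster_id])
```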

Through this sentiment analysis, the bureau looks to ensure a high level of customer service through its call center support.

“While we want to use that information to actually help us achieve our mission, we also, of course, want to use it to better serve the citizen,” Peralta said. “We want to know when we’re not hitting key pieces of information, where sentiment analysis may indicate we need to train some staff up to really provide a better service to the citizen.”

The bureau also looks to use the analysis to correlate whether certain actions from call center staff actually lead to callers resolving their delinquent debt.

“It’s just really leveraging our information strategically to basically say, how can we improve overall? So we’re guaranteeing to a certain extent that the citizens engaging with us are really getting a positive experience throughout their debt resolution conversations with our agents,” Peralta said.

However, he cautioned that moving to a cloud-powered data lake won’t serve as a “silver bullet” to solving an agency’s data management problems.

“Unless you have the right governance — the right understanding of what’s going in, what’s going out, how your data is defined, how it can be used, how it can’t be used — that data lake quickly becomes a data swamp,” Peralta said. “Governance is key — cataloging information so that the folks that are engaging with this diverse pool of information really understand the value that they can obtain from it.”

The need to build strong foundations in data management becomes more critical as more agencies pursue pilot programs in artificial intelligence and machine learning. 

At the same time, landmark legislation, like the recently passed Foundations for Evidence-Based Policymaking Act, also emphasizes the need for open data to remain secure and untraceable to personally identifiable information.

“Our bottom line is public trust,” Peralta said. “It’s important for us to ensure that the data that we collect is basically used for the purposes for which it was collected.”

DISA takes center stage in DoD’s move to the cloud (Jan. 28, 2019)

The Defense Information Systems Agency wants to be right in the middle of all of the Defense Department’s cloud activities.

Whether it’s through the MilCloud 2.0 offering or the effort to bring the DoD’s Fourth Estate — the Office of the Secretary of Defense and all the defense agencies and offices — under one technology umbrella, or whether it’s the expansion of enterprise services, DISA executives say the agency is leading many of the reform and efficiency initiatives mandated by former Defense Secretary James Mattis.


“A lot of those initiatives are coming to DISA to execute. DISA is going to be the center stage for all the cloud activities taking place across the department. One of the other new things that came out recently is the demand to consolidate across the Fourth Estate all of those independently-owned and operated networks into one network and DISA has the responsibility to do that consolidation and convergence into that one network solution,” said Dave Bennett, DISA’s director of the Operations Directorate and chief information officer, at a recent panel sponsored by the AFCEA Washington, D.C. chapter. “So as you look across a variety of things, use cases, if you will, to drive the department in a new direction, the agency is central to many of those activities either in helping to design the solution or actually doing the execution.”

MilCloud 2.0 is one of the key pieces of the Fourth Estate transition.

Jason Martin, director of DISA’s Services Directorate Executive, said the cloud hosting platform is a major part of conversations with all the military services, Defense agencies in the Fourth Estate and even the intelligence community.

“We’ve worked internally [with] the agency to identify all the applications capable of running in a virtualized environment. We, in fact, have over 30 today that are in the process of being built and/or are completely built running through the implementation process right now.”

Martin said with the move of the Fourth Estate, more applications will migrate over the next year or so.

And it’s the readiness of those applications to move to the cloud — private, public or otherwise — that Martin and other DISA executives want to make sure their partner agencies fully understand.

Terry Carpenter, the director of the National Background Investigative System Directorate at DISA, said this issue is one that the contractor community needs to take a more active role in.

“From an app owner perspective, those of you who are working with government partners and providing services to help them manage applications, please, show them, teach them, point out the lessons and things you’ve learned about how hard it is to get your applications ready for the cloud. Some will be easy. But there are those where you really do need to look at the application architecture. You need to look at the way in which you are managing the code and your cloud infrastructure. But that’s not the hard part. The hard part is putting all those pieces together and understanding the amount of work and time it takes to get ready for cloud to really use it.”

Martin added that DISA doesn’t have developers to build apps or platforms, so industry must fill that need across DoD.

Reshape apps and add automation

At the same time, Roger Greenwell, DISA’s risk management executive and authorizing official, said opportunities like the one in front of the Fourth Estate should be used to reshape applications by adding automation to improve security and functionality.

Lytwaive Hutchinson, the deputy director of the Joint Service Provider at DISA, said there are dozens of examples where customer agencies think they can just “lift and shift” an app to the cloud, only to end up spending money to make the move happen.

Martin said DISA is working with the DoD CIO on the “art of the possible” when it comes to applications.

“We identified based on existing platforms what we have in our kit bag today to help them migrate,” he said. “We also provided additional ideas for toolsets for how they can potentially migrate from one legacy platform to a newer platform. And then we look overall at the application’s functionality to help them decide where it should reside for the long term.”

Martin said he expects that as the services and the Defense agencies begin to rationalize their applications, the use of dev/sec/ops will grow quickly.

Martin said in the end the goal is to reduce the major cost drivers: operations, sustainment and people.

“We have a lot of legacy applications. DFAS was with us [Wednesday]. We were talking about an application that was built 50 years ago. We will not automate that anytime soon. But that is indicative of what we have in the data centers over time,” he said. “We, like everyone else, want to evolve so we have to look at this in a couple of different ways. We have to build for the cloud, not just forklift apps to the cloud. You will not save a lot of money. Build for commodity, build higher up the stack and leverage software-as-a-service. So think before you build and consolidate before you hand off.”

Army to rationalize 8,000 apps as part of setting cloud foundation (Jan. 14, 2019)

For the Army, the next 12 months is all about laying the foundation to meet both its tactical and enterprise requirements in a hybrid cloud.

The first step is to reduce the more than 8,000 applications the service currently supports.

“If you looked at these 8,000 apps, some of them are very new. Some of them are what we call antique but we have to keep them around. So it’s no one-size-fits-all when it comes to where we are going,” Army chief information officer Lt. Gen. Bruce Crawford said on the Federal Cloud Report. “It’s not just about putting our data in the cloud. We’ve got to be able to access that data so our identity credentialing, authentication and access management steps have to be taken in order to complete the circle. It’s being able to access your data, anywhere, anytime and in all environments.”

Army Lt. Gen. Bruce Crawford is the chief information officer/G-6. (U.S. Army photo by Monica King)

Crawford said the Army is going through the process of deciding which apps can migrate today, which ones will need more time and which apps they may need to get rid of altogether.

“I do not envision us migrating 8,000 apps. What I envision us doing is rationalizing down to about 5,000 apps,” he said. “The other realization that we came to with the market research we’ve done, not all apps need to be moved to the cloud. Some will remain in legacy data centers.”

The Army has been on a mission to test out different cloud services, both public and government-only. It also has been under a strict mandate to reduce the number of data centers it runs as part of this effort.

Like many departments in the military and civilian world, the Army’s goal is to move as much of its data as possible to the cloud and reduce the cost of managing data centers.

Crawford said the service has 1,334 data centers and a goal of getting down to 306 with four core data centers by fiscal 2022. The Army had a goal of closing 752 of them by the end of 2018, according to a 2016 presentation by the Program Executive Office, Enterprise Information Systems.

Crawford didn’t say if the service met that September goal, or how many data centers have been closed.

But the data center consolidation effort is key to this hybrid cloud effort because the Army always will maintain data that is too sensitive to put in a commercial cloud.

“What we are doing now is going through the rationalization process. Everything that I’ve learned from talking to the various CEOs and the various engineers from all types of companies who have gone out and actually migrated to the cloud, and all the market research I’ve seen, says you’ve got to set the environment first,” he said. “You’ve got to get a series of diverse pilots going to inform future design options of what the ultimate cloud will look like for the Army. You will see some of that happening this coming year from the Army.”

Additionally, Crawford said the Army wants to solve several “wicked” problems.

“We have some important data that sits in a couple of our data centers so while we are experimenting, we want to make sure we bring that with us as part of the experiment so we can solve some of these problems,” he said. “The other piece of it is more forward thinking. For the synthetic training environment, there is about a 7-to-8x increase in capacity required to realize this vision of a soldier being able to train on any piece of turf, any piece of ground virtually anywhere, from any installation, and we have 288 installations across the Army. That will generate a lot of data.”

The synthetic training environment is one of the Army’s eight cross functional teams created under Secretary Mark Esper’s modernization plan.

“While we set the environment in 2019 and begin to migrate apps in 2019, we want to make sure we are addressing and laying the groundwork for those synthetic training environment requirements as part of future modernization but also current readiness when we look at very important data we have sitting in legacy data centers,” Crawford said.

The Army tested commercial cloud capabilities at Redstone Arsenal from 2016 to 2018.

“We learned a lot about dealing with and engaging with a commercial vendor from security to the risk management framework and the importance of streamlining that particular process so we can actually get authorities to connect to the network,” Crawford said. “That pilot ended last spring. Where we are headed now is off on a new venture in terms of laying out the cloud hosting environment for the Army. What I can tell you, I’d expect significant activity in 2019 out of the Army when it comes to the cloud.”

VA goes on acquisition spree to support its cloud habit (Aug. 6, 2018)

The Veterans Affairs Department’s spending on cloud services and cloud migration has been growing each of the last four years, reaching more than $860 million in 2017.

Now VA wants help in managing the cloud infrastructure, accessing the cloud through mobile devices and applying artificial intelligence to the assorted services and data.

Over the last 30 days, VA has been on a bit of a procurement binge, releasing requests for information, requests for quotes and other solicitations for vendors to provide them with services over the next few years.

John Everett, the executive director for demand management at the VA, said earlier this year during a panel discussion that his agency’s goals with modernization are around enabling the agency’s customers to use services faster and more easily.

Everett said VA has about 621 applications that have the potential to move to the cloud today. Going forward, he said VA is taking the buy-before-build approach to new capabilities, particularly by using software-as-a-service.

VA’s concept of making it easy for veterans to log in once and get everything done, no matter which part of the agency the service is coming from, is not new. And one way to achieve the “do once, use many” concept is through the use of application programming interfaces (APIs).

VA released an RFI for an API management platform that would help the agency bring its digital experience on par with commercial companies.

“VA is taking an API-first strategy to deliver the high quality digital experiences our users expect. A single set of APIs will power every VA digital service, and these same APIs will be exposed to approved third parties to build products and applications on top of VA services and data,” the RFI states. “These APIs across every vertical of VA’s business will enable VA users to receive a consistent, high quality experience across all VA communication channels (e.g., digital, phone, mail, in-person, etc.).”

VA said APIs will let Veterans Service Organizations and others reduce the amount of time they spend manually searching VA systems to check the status of a claim or find out if a rating has been granted to a veteran.

“If VA were instead able to provide APIs to this information, authorized individuals would be able to access it more readily, improving the experience they can provide for Veterans and reducing VA costs,” the RFI states.
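
As a sketch of the kind of endpoint the RFI describes, the fragment below exposes a claim-status lookup over HTTP. The route shape, field names and in-memory data are hypothetical; this is not VA’s actual API.

```python
# Hypothetical claim-status API in the spirit of VA's API-first strategy.
# Route, fields and data source are invented for illustration.
from fastapi import FastAPI, HTTPException

app = FastAPI(title="Claim Status (illustrative)")

# Stand-in for a call into VA's systems of record.
CLAIMS = {"12345": {"status": "PENDING_REVIEW", "rating_granted": False}}

@app.get("/v0/claims/{claim_id}/status")
def claim_status(claim_id: str) -> dict:
    claim = CLAIMS.get(claim_id)
    if claim is None:
        raise HTTPException(status_code=404, detail="claim not found")
    # One response body backs every channel (web, phone scripts, approved
    # third-party apps), which is the point of "do once, use many."
    return {"claim_id": claim_id, **claim}
```

An authorized Veterans Service Organization could then poll an endpoint like this instead of manually logging into VA systems to check each claim.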

VA wants the vendor to provide agile development of APIs and to support its API gateway through an assortment of project management, product management, human-centered design, iterative development, user research and usability testing, automated testing support and automated monitoring and performance reporting. The agency created the API gateway earlier this year as one of its “lighthouse” micro-purchasing efforts.

The contractor also must ensure all the APIs are deployed or migrated into a VA designated cloud.

At the same time, VA also issued a request for proposals under its T4 Next Generation multiple award contract for enterprise mobility management.

The initial RFI from May seeks a vendor to migrate VA’s mobile management system out of its FISMA high internal cloud and into a software-as-a-service offering in a FISMA moderate external cloud.

VA wants the device management system to support between 45,000 and 100,000 devices, and connect back to its Microsoft Office 365 email in the cloud.

Additionally, VA issued a task order against the General Services Administration’s IT schedule for a service-disabled veteran-owned small business to provide cloud solutions and engineering support.

According to the initial RFI, the CIO’s office requires “cloud support services to assist VA in reviewing, validating and recommending improvements for planning, assessment, execution support, operations and enhancement of VA cloud computing capabilities.”

Finally, VA wants to enhance all of these cloud efforts through artificial intelligence.

The agency issued another RFI seeking service-disabled veteran-owned small firms to provide AI services through a software-as-a-service offering to further improve the veterans’ experience.

“Office of Information and Technology has identified numerous business opportunities that leverage AI to provide veterans, caregivers and survivors with better access to information related to healthcare and benefits through various VA websites and portals. Many of these sites offer the ability to chat with a VA agent or to contact an agent by phone. However, maintaining a large staff of well-trained agents to handle the depth and breadth of questions and issues veterans and caregivers commonly seek assistance with during peak days and times is challenging and many times veterans and caregivers are unable to receive immediate assistance because agents are actively assisting other customers,” the RFI states. “AI can help minimize this issue by improving the speed of information retrieval, and the quality and accuracy of information provided to the end user. AI can be leveraged to assist VA agents and veterans/caregivers using either a chat or voice interface. AI can be trained to answer commonly asked questions, assist users in properly filling out forms, respond to routine issues, and assist VA agents to quickly locate key information relevant to a customer’s specific concern. Initially, AI must be taught using assisted learning, but once it is put into production it can learn over time to both expand and improve its own capabilities.”

VA said in the RFI that it already has begun developing AI epics and user stories related to the White House VA Hotline and various other web-based VA portals including the eBenefits portal, enrollment and eligibility Veterans Choice Program, and Affordable Care Act.

Air Force moves portal to commercial cloud, begins migrating other apps (July 30, 2018)

A big change occurred for airmen last month, but it may be a little while longer before they notice it.

The Air Force Portal, known for posting promotion lists and other information, now lives in the cloud. The portal is only one of many apps that will eventually move to a commercial cloud environment.

The portal migrated to Amazon Web Services on June 30 and is opening up new possibilities for its 750,000 monthly users. The Air Force is putting some of its software in Microsoft’s Azure as well.

Hanscom Air Force Base, which hosts the part of the Air Force Lifecycle Management Center in charge of cloud, is now in the process of moving other applications to the cloud.

“We currently have 39 applications in our pipeline that are being refactored to go live in the cloud. We have eight applications right now, the portal being one of them,” said Kerry Coburn, program manager for common computing environment acquisitions at Hanscom. “Beyond that 39, we have another 84 that we are putting on contract … I expect that slowly our airmen are going to see a huge increase in the responsiveness of the apps that they use daily to do their jobs.”

Coburn said the apps range from logistics applications that affect real-time, real-world operations to human resources functions to financial work.

The Air Force is taking a hybrid approach, working with a prime contractor and small businesses to develop apps and migrate them to the new cloud environment.

“In the future we are going to award another contract that will continue this work and open it up to more vendors to refactor and migrate applications,” Coburn said.

Less software, more savings

The AWS cloud uses a pay-as-you-go model instead of a flat fee to charge the Air Force, leading to savings when cloud traffic is down. It also means the contractor has an incentive to make the portal and the cloud as useful as possible so more airmen will use it.

“They have a vested interest in providing more and more services because it’s all metered usage. The more services they can provide the more money they can make, so they have an interest in providing better service,” said Bob Oshel, systems architect at Isobar, a company that helped the Air Force with the conversion process.

Coburn said the Air Force’s previous environment was very software heavy.

The Air Force previously used Defense Information Systems Agency data centers to host its applications.

“Now our services are running at a cost of about $140,000 a month. Compare this to our heritage environment, where the software alone cost us $1.3 million, and that doesn’t include the labor to support that system. That is a huge savings upfront,” Coburn said.
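
A toy calculation shows why metered billing changes the math. The rates and usage pattern below are invented for illustration; they are not the Air Force’s actual figures.

```python
# Toy comparison of flat-fee vs. metered (pay-as-you-go) hosting.
# All numbers are invented for illustration.
FLAT_FEE_PER_MONTH = 200_000   # hypothetical fixed hosting cost, dollars
RATE_PER_REQUEST = 0.002       # hypothetical metered price per request, dollars

monthly_requests = [90_000_000, 55_000_000, 30_000_000, 70_000_000]  # traffic dips

for requests in monthly_requests:
    metered = requests * RATE_PER_REQUEST
    print(f"requests={requests:>12,}  metered=${metered:>10,.0f}  flat=${FLAT_FEE_PER_MONTH:,}")
# Metered cost falls with demand (e.g., 30M requests -> $60,000), while a
# flat fee is owed in full even in a slow month.
```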

Oshel said there were a number of challenges the Air Force faced in migrating to the cloud.

“We had to upgrade, sometimes several versions of the software during the migration. And then also, in a traditional data center you [have] things like network attached storage, which aren’t necessarily a one-to-one comparison to cloud services. Those were some challenges, as well as a lot of historical platforms, application platforms,” Oshel said.

The Air Force’s transition to commercial cloud services comes at a time when the military as a whole is trying to figure out how it will move to the cloud. DoD recently released the final JEDI cloud solicitation and DISA is working on an enterprisewide cloud approach to office software called the Defense Enterprise Office Solutions (DEOS).

Making cloud accessible to troops in battle faces a steep hill (July 23, 2018)

Taking cloud to the very edges of the battlefield can be difficult, especially when the edges of the battlefield include air, sea and space.

Doing that while keeping the cloud secure from hackers makes the challenge even more difficult, and the Navy is embarking on new exercises to make sure the hardware and software it uses for the cloud aren’t compromised.

Michael Kilcoyne, the command information assurance manager at the Naval Facilities Engineering Command, said the service is doing a set of experiments looking at supply chain management.

“We are going to be doing what we call cyber tabletop exercises,” Kilcoyne said during a July 11 speech at the Defense Systems Summit in Arlington, Virginia. “Those are kind of like an adversarial risk assessment, using threats, working with the Office of Naval Intelligence and other agencies to get what’s happening out in the wild with the supply chains to specific systems, how they would attack those systems and what impact they would have on the mission of our carrier strike force.”

Kilcoyne said that will be happening during fiscal 2019.

While cloud is not yet at the “tactical edge” in most cases, Kilcoyne said those exercises will help get the Navy there.

Some data challenges remain

“You have to look at the data. There’s some data that might be able to go into a public cloud. There’s other data that needs to go in a more secure cloud,” Kilcoyne said. “Then you also have to have a good understanding of mission sets.”

He added the Navy is just now doing mission threat analysis, in which it decomposes various missions, like nuclear command and control, to find out everything from software to infrastructure that is needed to complete the mission.

The analysis allows the Navy to prioritize what the service wants to move, how it wants to move it, where it wants to move it and then get things through the acquisition process to move to the cloud, Kilcoyne said.

The Navy isn’t the only service doing analysis to move to the cloud. Air Force Chief Technology Officer Frank Konieczny said his service is conducting similar planning.

“We’ve set up mission defense teams to actually look at issues with the mission thread and defend those mission threads. They are all key terrain and they involve not only computers and networks, they involve weapons systems,” Konieczny said.

Both men said the services need to be doing these analyses as they simultaneously move into the cloud.

Of course, there are always challenges to doing analysis and moving to the cloud in new areas.

They agreed the budget is the biggest obstacle to moving to the cloud in certain areas.

Other issues include legacy systems and getting them up to date and transferring data to cloud hosts.

“You have to system engineer these systems to work within the cloud. You have to look at the services,” Kilcoyne said. “I’ve seen everything from ‘You just bring your [web] application and we’ll supply the rest’ to ‘We’ll supply the hardware and the operating system and then you have to bring your database and all of that’.”

Culture is another issue, Konieczny said.

“As we talk about the PEOs, it’s a question of control over what they want to have and be in control of. You run into the issue of ‘I can’t move myself to the cloud because I don’t have a budget, because I haven’t allocated that and I won’t be able to do that until the next program objective memorandum comes through.’ The answer is move it. Get over it, you have to do this. That has been a problem space all along,” Konieczny said.

The Air Force has already set up a center to help PEOs move to the cloud to deal with those issues.

End-to-end dev/ops automation critical for cloud success, DHS officials say (July 17, 2018)

As agency software development teams engineer applications for cloud environments, end-to-end automation of the entire dev/sec/ops workflow is critical for successful deployment of cloud services.

To that end, the Homeland Security Department is hoping by year end to launch Cloud Factory, a platform offering shared services capabilities to provide a fully automated provisioning and delivery lifecycle for cloud services.


“Cloud Factory is coming to fruition and with the U.S. Digital Service we are doing an assessment” as the platform goes through an authority to operate, said Kshemendra Paul, cloud action officer with DHS.

The idea of end-to-end automation is critical for dev/sec/ops and cloud transformation. “You can’t have an automation step and then a manual step, and an automation step and a manual step. That defeats the purpose,” Paul said. He spoke at the ACT-IAC Federal Insights Exchange and IT Management and Modernization COI Program, featuring the Department of Homeland Security Cloud Forecast, in Washington, D.C., on July 11.

Cloud Factory will serve as a “reference implementation of how to do end-to-end automation.” The aim is for DHS headquarters to collaborate with the department’s component organizations on sharing best practices, lessons learned, scripts and code. Then, “DHS can coalesce around common approaches to realize automation for productivity gains,” Paul said.

Cloud Factory “supports the build, test and deploy aspects of dev/ops as well as the operational (production) support needed to host and secure the application and its mission,” the agency writes in its 2019 budget justification to Congress.

“The system will ingest user code, assemble the desired machine images (MI), customize the MI configurations, validate security configurations, and deploy the environment in hours as opposed to months. It will utilize account monitoring tools which the business owner will be able to view usage statistics, costs, utilization data and various at hand dashboards to ensure they are meeting mission objectives,” according to DHS.

U.S. Immigration and Customs Enforcement (ICE) and U.S. Citizenship and Immigration Services are the DHS components furthest along in terms of cloud engineering and the tool chain, Paul noted.

At USCIS, development teams are looking at automation as a way to help stand up a dev/ops environment. In general, when developers want to make a change to an application, they create feature releases inside the application. The first step is to stand up infrastructure to support that application. They might use an automation server like Jenkins to create a CloudFormation template or a scripting tool to build out the infrastructure, said Steve Grunch, branch chief of enterprise cloud services at USCIS.

Then a development team would lay the application on the infrastructure, where they might deploy configuration management tools to install the software and make the required configuration changes. Automated testing will most likely be a part of the process. Once all these functions clear the gates, a final job launches the new changes and executables out to the production environment.
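
Condensed to code, that flow looks something like the sketch below. The stage commands are illustrative stand-ins (USCIS orchestrates the real thing through Jenkins and similar tools), and the template, playbook and script names are invented.

```python
# Sketch of an end-to-end dev/sec/ops pipeline: provision, configure,
# test, promote. Stage commands and file names are hypothetical.
import subprocess

def run(cmd: list[str]) -> None:
    # Fail the whole pipeline the moment any stage fails: no manual steps.
    subprocess.run(cmd, check=True)

def pipeline() -> None:
    run(["aws", "cloudformation", "deploy",          # 1. stand up infrastructure
         "--template-file", "infra.yaml",
         "--stack-name", "feature-env"])             # stack name is hypothetical
    run(["ansible-playbook", "configure-app.yaml"])  # 2. lay the app on the infrastructure
    run(["pytest", "tests/"])                        # 3. automated tests as the gate
    run(["./deploy-to-prod.sh"])                     # 4. final job promotes to production

if __name__ == "__main__":
    pipeline()
```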

“Ideally, what we are looking for automation [to do], in terms of that process, is [ensure] that the team that developed the application has developed an immutable process of getting that application code out into the cloud environment,” Grunch said. And there should not be any manual process steps to get that product into service.

“At the end of the day, that is what we are looking for,” Grunch said. He noted that automation is being deployed in other areas such as for governance controls.

Data management critical

As agencies move to the cloud, modernize their networks and consolidate security operations centers, it is important that IT managers not forget the need to keep maturing the enterprise data management function, so that data quality is better managed and there is better transparency of data, Paul said. There are movements within DHS to improve data management, which is making an operational impact on how DHS components administer their missions, he noted.

“To realize the targeted benefits for the cloud we have to be very aware about the data,” Paul said. “Vice versa, I don’t think the folks on the data side are going to make the progress they think they can make without reducing the friction on dealing with data from an infrastructure perspective.”


Rutrell Yasin is a freelance journalist.

Senate praises DHS data center consolidation effort by opening up its wallet for 2019 (July 9, 2018)

Senate appropriators want to give the Homeland Security Department’s chief information officer $60 million more in fiscal 2019 than in 2018.

Lawmakers expect the DHS CIO to use some of that increase to $382 million to continue the modernization and consolidation of the agency’s data centers, using both public and private cloud options.

“The committee is pleased with OCIO’s continued leadership in data center consolidation, which is enhancing the effectiveness, efficiency, and security of the department’s IT enterprise. Further, the committee commends OCIO for its efforts to collaborate with NASA to gain efficiencies by establishing IT operations centers at data center one and by encouraging other federal partners to co-locate with the department at its data centers,” appropriators wrote in the report accompanying the Homeland Security spending bill. “In addition to budget justification materials and obligation plans, OCIO shall provide semiannual briefings to the Committee on the execution of its major initiatives and investment areas. Such briefings shall include details regarding cost, schedule, and the transfer of systems to or from department data centers or external hosts.”

DHS is preparing new acquisitions for the continued consolidation and modernization of its data centers.

Soraya Correa, DHS’ chief procurement officer, said the data center contracts expire in June 2020 and they are starting to work on the strategy to recompete those vehicles.

She said the CPO and CIO’s offices recently had an offsite meeting to discuss the path forward, including industry days and the like.

The data center strategy likely will be part of DHS’ larger move toward using multi-vendor hybrid cloud.

John Zangardi, the DHS CIO, said at the recent Washington Technology DHS Industry Day that his office laid out the policy and approach to go to cloud, and now comes the implementation part.

“I think you have to look at what folks need, and there will be different needs, and you should make sure those needs align to what platform delivers the capability you need,” he said. “It’s multi, but I don’t want a hundred; that would be unmanageable. There is [a] limit to the number you will have out there, and that would make sense to anyone who is involved in security. Reasonably speaking, you will need multiple platforms to deliver the right capability to the user.”

Stretch goals for cloud

Zangardi said he created a cloud steering group, led by Claire Grady, the undersecretary for management, that includes CIOs from components and other CXOs, to develop the strategy and implementation.

He said the group will try to address the regulatory and policy challenges, and set metrics for moving to the hybrid cloud.

“About 29 applications are out in the cloud. There’s about another 70 in progress of transitioning out to the cloud,” Zangardi said. “Some components have been very aggressive in moving out there. Immigration and Customs Enforcement and Customs and Border Protection have done a fantastic job. There are smaller components and other components that need help. Everyone needs help in getting to the cloud in terms of contracting vehicles, in terms of my authority to operate (ATO) process, in terms of how we deal with privacy and in terms of network and security, the whole thing. There is a lot out there.”

Each component, including headquarters, will have goals around moving to the cloud. Zangardi said through these stretch goals, DHS also will better understand what challenges components face and how his office and the undersecretary for management’s office can help overcome them.

“We do want to go to the cloud and do want to go there smartly, and we are putting in place the pieces that will motivate and incentivize the people to get out there,” he said. “When we find things that are blockages, we want to work through them as appropriate.”

A key piece to the cloud effort is DHS’ Cloud Factory initiative, which the agency says is a highly automated, secure, reliable set of managed services that allow for the dev/ops flow, feedback and innovation of various applications.

“The platform supports the build, test and deploy aspects of dev/ops as well as the operational (production) support needed to host and secure the application and its mission,” the agency writes in its 2019 budget justification to Congress. “The system will ingest user code, assemble the desired machine images (MI), customize the MI configurations, validate security configurations, and deploy the environment in hours as opposed to months. It will utilize account monitoring tools which the business owner will be able to view usage statistics, costs, utilization data and various at hand dashboards to ensure they are meeting mission objectives.”

Zangardi said the steering group will ensure the components are aware of and use Cloud Factory to help with application modernization efforts.

Network modernization and managed services

In addition to data centers and applications, Zangardi said the DHS network, known as One Net, will depend heavily on cloud infrastructure.

In its 2019 budget justification, DHS proposes to spend $8.2 billion for One Net with most costs going to contracted support of data storage and cloud migration and infrastructure hardware and software management tools.

Zangardi said One Net, which is for unclassified and secret data, needs to be modernized because the network is relying too much on old technology.

“The way I want to do it is go to a managed service. Over the last few months, we’ve been working through what that entails,” he said. “We’ve been doing research. We’ve spent time looking at what other agencies like NASA, DoJ and TSA have done, and what the Navy did when I was there with NGEN. We are trying to figure out what is the best approach when you move to a managed network. We haven’t gotten to all the decisions yet. As you can imagine, this is a big thing. It’s hard. We are just beginning the process of what the requirements look like, what is the statement of work, what is the acquisition strategy we want to pursue on this.”

Zangardi said he envisions a five-year contract with a five-year option that could follow one of several models: government-owned and contractor-operated, or contractor-owned and operated.

“One of the concern items I have as I go through this is how do I keep industry interested in coming into this model where I’m looking for one prime vendor to run the full gamut of things, from end-user hardware to security to transport to enterprise services to managing my software licenses,” he said. “We have to make sure we provide to you the information that is on our network. We have to pull that all together, so that’s a huge task to make sure the vendor understands the state of the network.”

DHS will release requests for information, draft solicitations and industry days before releasing a final request for proposals.

Zangardi said there still are a lot of questions about what the future of One Net will be — a single network or a convergence — and how to incentivize the components to join One Net.

Finally, Zangardi said DHS is moving to Windows 10 by the end of September and will release a policy to move the entire agency to Microsoft Office 365 in the cloud. Right now, FEMA and ICE are using O365, and Zangardi wants to figure out what’s best for the rest of the agency.

Deasy takes control of JEDI, but fate of DoD’s cloud steering group is up in the air (July 3, 2018)

The Pentagon on Monday added some clarity to what it meant by a vague statement it issued a week earlier announcing that its new chief information officer would assume responsibility for the Defense Department’s “cloud initiative.” But a significant portion of the department’s cloud planning effort remains up in the air.

In a memo DoD provided to Federal News Radio, the department made explicit that the new CIO, Dana Deasy, is now in charge of the upcoming multi-billion dollar Joint Enterprise Defense Infrastructure (JEDI) contract. The memo, signed by Deputy Defense Secretary Patrick Shanahan, orders the program management office conducting the JEDI procurement to be transferred to Deasy’s “authority, direction and control.”

Previously, the JEDI effort — and DoD’s broader cloud strategy — had been overseen by the Cloud Executive Steering Group (CESG), a cross-functional leadership team Shanahan assembled last September and then restructured in January. While the latest iteration gave the CIO’s office a vote, the group was chaired by John Gibson, DoD’s chief management officer, and the Defense Digital Service had direct responsibility for conducting the JEDI acquisition.

In the memo, Shanahan said it’s also his “intent” that Deasy serve as the lead for the department’s overall cloud initiative. But in response to questions about whether the CESG still exists and what responsibilities it still has, the department said the answers are yet to be determined.

“As a part of designating CIO as lead for cloud, the department is reviewing existing governance structures and reporting chains and will take deliberate steps to realign authorities as necessary,” Heather Babb, a Defense spokeswoman, said in a statement. “This process is expected to occur over the coming weeks.”

Under the CESG’s initial charter, the JEDI acquisition was to be only the first phase of Shanahan’s multi-step plan to “accelerate” DoD’s move to commercial cloud services, an initiative he had dubbed “a department priority.”

The second phase was supposed to have been led by the department’s Strategic Capabilities Office, which Shanahan had assigned the task of deciding which DoD systems should be migrated to the new cloud environment first and “operationaliz[ing] the mission using the security, software and machine learning technologies that cloud technology provides.”

The Defense Department had previously told vendors and lawmakers that it would release a final JEDI request for proposals by May. But those predictions were made before Deasy was sworn in as CIO, and before he took over responsibility for JEDI.

On Monday, DoD declined to offer an updated prediction for when an RFP might be released, but denied that Deasy had directed any significant changes.

“There has been no change to the JEDI Cloud acquisition, and DoD is continuing to review the final RFP before it is released. I do not have an exact date at this time,” Babb said.

To date, the aspect of the JEDI acquisition that has generated the most criticism from Congress and industry groups has been the department’s decision to make the multi-billion dollar award to a single cloud technology vendor, or perhaps a single team of vendors. Among other factors, critics contend a single-award approach would deprive the military of much of the innovation that’s certain to take place over the next several years in the dynamic, competitive commercial marketplace for cloud services.

In a May report to Congress, the department justified that decision by saying its plans for rapidly moving Defense applications to the cloud would be delayed if it were to issue a multiple-award contract, because it would have to conduct separate competitions each time it issued a task order.

“That pace could prevent DoD from rapidly delivering new capabilities and improved effectiveness to the warfighter that enterprise-level cloud computing can enable,” officials wrote.

Defense officials have not indicated whether that stance has changed since Deasy’s arrival, nor have they given any hints as to what’s behind the delay of the final RFP.

Deasy himself has made few public remarks since he became CIO in May, though he spoke in general terms about his vision for the job, including cloud computing, during remarks at AFCEA’s defensive cyber operations symposium in Baltimore about a week after taking office.

“Cloud is one of the most iterative things I’ve ever been involved with in 36-plus years,” he said. “First of all, there’s not a singular thing that cloud is. There’s a lot of different things to what makes up a cloud, and so one of the things I find you have to do early on to get clear what conversation you’re in … and to understand this is not a case of trying to lift out of your old world and suddenly trying to drop into a new world. But this is the most phenomenal opportunity I think we’ve ever experienced, to be able to look at your legacy estate and re-engineer.”

Read more of the DoD Reporter’s Notebook.

What’s next for FedRAMP? Automation, new authorizations later this year
https://federalnewsnetwork.com/cloud-computing/2018/06/whats-next-for-fedramp-automation-new-authorizations-and-more-later-this-year/ Mon, 25 Jun 2018 16:20:08 +0000

FedRAMP has some major changes in the works that will benefit both federal agencies and cloud providers. From automated authorization processes to updated requirements and training for third-party assessors, FedRAMP aims to make it easier than ever to match federal agencies transitioning to a secure cloud environment with cloud providers offering solutions.

At the June 13 ATARC Cloud Summit, Matt Goodrich, director of the General Services Administration’s FedRAMP program, detailed some of the changes coming in the near future. Not least of these: like many other government organizations, FedRAMP is looking to enhance its services with automation.

FedRAMP has been working with the National Institute of Standards and Technology to begin implementing OSCAL, the Open Security Controls Assessment Language, NIST’s machine-readable control language for security authorization. Currently, federal agencies struggle with security controls that are presented in ways that are subjective, open to interpretation, and require manual data entry before they can be used. The existence of multiple regulatory frameworks within a single agency compounds the issue.

That’s why NIST is creating a standardized control language as an objective and machine-readable format for the representation of security controls. This would help speed the process along, create interoperability, better define specifications and enable automation.

“We’ve been partnering with NIST to make sure that they have enough resources to speed that up a bit to make sure we can try and get something out by the end of this fiscal year, at least in a minimally viable way, to begin testing that out,” Goodrich said. “We think that’ll really help agencies transform the way they’re doing their work by making sure they can use whatever tool they want to use, and automate whatever they can in that process to do the authorizations.”

It’s important to Goodrich that agencies be able to use whichever tool they prefer. He said no one wants to open FedRAMP’s System Security Plan, which is simply too large to be practical; even he navigates it using the “ctrl-F” search function.

Instead, FedRAMP wants agencies to feel comfortable with the tools they use and secure in the knowledge that the data will be transferable. OSCAL will help accomplish this.
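To make that concrete, here is a minimal sketch of what consuming a machine-readable control could look like in practice. The JSON structure and field names below are illustrative assumptions loosely modeled on OSCAL’s catalog format, not the official NIST schema; the point is simply that any tool an agency prefers can parse the same data, rather than a person re-keying values by hand.

    # Illustrative sketch only: a simplified, OSCAL-like control catalog.
    # Field names here are assumptions for demonstration, not the real schema.
    import json

    catalog = json.loads("""
    {
      "catalog": {
        "title": "Example control catalog",
        "controls": [
          {
            "id": "ac-2",
            "title": "Account Management",
            "params": [{"id": "ac-2_prm_1", "label": "inactivity period"}],
            "parts": [{"name": "statement",
                       "prose": "Disable accounts after the defined inactivity period."}]
          }
        ]
      }
    }
    """)

    def list_required_params(catalog):
        """Return (control id, parameter label) pairs -- the kind of extraction
        a compliance tool could automate instead of a human copying values
        out of a thousand-page document."""
        pairs = []
        for control in catalog["catalog"]["controls"]:
            for param in control.get("params", []):
                pairs.append((control["id"], param["label"]))
        return pairs

    print(list_required_params(catalog))  # [('ac-2', 'inactivity period')]

Because every tool reads and writes the same structured representation, the security data travels with the agency rather than staying locked inside any one product.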

OSCAL should also help FedRAMP amplify the effect it’s having on the federal push to secure cloud environments. Goodrich said FedRAMP’s seven federal employees attended more than 750 meetings last year and helped the government avoid spending more than $170 million per year on cybersecurity. Providers authorized under FedRAMP currently carry about one-third of the world’s internet traffic, and the program covers around 5 million assets.

Automation should allow them to expand on those effects.

FedRAMP is also going to be increasing the number of authorized cloud services.

“This is obviously something that everybody wants,” Goodrich said. “The more cloud services you have for agencies, the more options you have, the more things you can move to the cloud. For vendors, the more that are in [the program], the more agencies can use their services.”

And this isn’t just generalized cloud services; Goodrich said FedRAMP currently has 15 tailored authorizations in process. Three have already been completed, and there’s a lot of momentum in this area. In tailored authorizations, agencies can partner with vendors and act as an independent auditor.

But if that doesn’t appeal to an agency, FedRAMP is also planning to update the requirements for its 3PAOs (third-party assessment organizations); it hopes to release the updates later this month.

In particular, it’s looking to bolster the program by adding a hands-on testing exercise for individual assessors. It wants to start seeing accreditations for individual assessors, not just organizations.

“Right now we accredit individual organizations, but don’t really go down to the assessor level,” Goodrich said. “And so what we’re looking to do is enhance the program to ensure that all of our assessors are authorized as well.”

FedRAMP plans to roll this out over the next few months. Meanwhile, it released its new training platform earlier this month, made up of 300-level courses; Goodrich said FedRAMP wanted to respond to the strong demand for more training.

It also just released authorization boundary guidance, which helps agencies stay aware of their smaller systems and understand how federal data can be affected by interactions with larger systems.

Add to all of that two new playbooks FedRAMP intends to release later this year, and Goodrich said they’re going to have to make it easier to find documents on the website.

$50B Alliant 2 IT services contract gets the go-ahead for agency use
https://federalnewsnetwork.com/reporters-notebook-jason-miller/2018/06/50b-alliant-2-it-services-contract-gets-the-go-ahead-for-agency-use/ Mon, 18 Jun 2018 13:16:20 +0000


The Government Accountability Office is still not required to follow the Administrative Procedure Act.

The General Services Administration’s next great IT services governmentwide acquisition contract finally is through the protest gauntlet and soon will be ready for use. And the Treasury Department makes a surprising decision about the future of one of its multi-billion dollar multiple-award IT contracts on the heels of GSA getting the go-ahead for its contract.

These three events are part of a busy last several weeks in the federal acquisition community.

Federal procurement experts say the U.S. District Court’s decision that GAO continues to be exempt from the APA isn’t surprising. The act waives sovereign immunity only with respect to suits seeking relief other than money damages and challenging the action or inaction of an “agency.”

The court said GAO is not an “agency” by the definition in the law since it’s part of the legislative branch.

Rob Burton, an attorney with Crowell & Moring and a former deputy administrator in the Office of Federal Procurement Policy, said the district court’s decision reinforces that GAO is a unique entity, governed by rules unlike those that apply to any other.

“We who work with GAO regularly know it’s not part of the executive branch, and the rules that apply to the executive branch do not always apply to the legislative branch,” he said. “But it’s fairly rare to have a disagreement with GAO over redactions to public protest decisions. Most of the time GAO is pretty good about it, and I have never seen them act in an arbitrary way.”

The case the U.S. District Court said it lacked jurisdiction to rule on was brought by Pond Constructors, which was unhappy with GAO’s decision to publish, in a protest decision, information that Pond says is commercially confidential, including bottom-line prices, adjectival ratings and past performance information.

“Pond alleges that GAO’s refusal to redact this information from the decision is arbitrary and capricious in violation of the Administrative Procedure Act,” the court states.

Bill Shook, a long-time procurement attorney, said the decision reinforces a major hole in the federal acquisition system — vendors have no recourse if GAO rules against them.

“I’ve always complained that GAO has no procedures for appealing decisions made by an attorney. Redactions are a perfect example of where I believe data has commercial value under the Trade Secrets Act and there isn’t anything you can do to stop them from publishing it,” he said. “If GAO says no to your request, there is no review of that decision except for Congress, and that’s not going to be successful.”

Shook said he recently had a situation with a client who was concerned that if its proprietary information were made public, it would lose what it sees as a competitive advantage over the competition.

“If GAO decides to put out that data, their competitors can use that same process and my client loses those trade secrets,” he said. “You are relying upon the reasonableness of the GAO attorney. I got some of that information redacted, but not all of it.”

On the other hand, Shook said, if a vendor submits a protest to the Court of Federal Claims and a judge makes a decision about redactions that the parties don’t agree on, there is an appeals process to a higher court.

Burton added that any attorney has to view the relationship with GAO as a partnership, not an adversarial one.

“You have to be persuasive in the fact that the information is proprietary and would hurt the company if released. Sometimes that argument is hard to make,” he said. “Generally, people abide by the protective order, and I’ve never seen a lot of proprietary information released by counsel. It just doesn’t happen as a general rule.”

GSA Alliant 2 beats protests

GSA also came out on top of a recent bid protest decision. Four vendors submitted complaints to the Court of Federal Claims after being left off of the $50 billion Alliant 2 IT services GWAC.

GSA awarded 61 vendors a spot on the unrestricted version of Alliant 2 in November.

Several unsuccessful bidders then took their cases to federal court, which ruled in GSA’s favor in early June.

GSA announced that Alliant 2 would be ready for other agency use starting on July 1.

“The court’s decisions further solidify the validity of GSA’s innovative procurement approach, highest technically rated offerors with a fair and reasonable price,” wrote John Cavadias, the GSA Alliant 2 GWAC procuring contracting officer in the IT Services Contract Operations Division at the Federal Acquisition Service, in a blog post on GSA’s Interact site.

GSA said when it made the initial awards that Alliant 2 would be a key piece of the Trump administration’s IT modernization effort.

GSA made several changes to Alliant 2 from the initial contract awarded in 2009, including:

  • Flexibility as emerging technologies and the definition of information technology evolve.
  • Ancillary support (non-IT) permitted when it is integral to, and necessary for, the IT services-based outcome.
  • On-ramp and off-ramp provisions, ensuring retention of a highly-qualified pool of contractors.

The original Alliant contract has been popular with agencies, receiving 865 task orders worth more than $19 billion over the past 10 years.

The Army, Air Force, the Defense Department and the Homeland Security Department were the biggest users of Alliant by total obligations, while SAIC, Booz Allen Hamilton and Leidos were among the biggest winners among vendors by total sales.

IRS to use best-in-class contracts

On the heels of GSA’s win in court, Treasury decided not to recompete its large IT services contract known as TIPSS-4. Instead, the agency will let it expire in December and move all work to other governmentwide contracts.

“We, along with all other government procurement offices, have been mandated to acquire needed services and supplies from already established governmentwide and designated ‘Best-in-Class’ vehicles. Therefore, we are transitioning all current and future projects to GSA Schedule 70, GSA Alliant and/or Alliant-SB and other vehicles as we see appropriate,” the IRS writes on the TIPSS 4 website.

The IRS awarded the TIPSS-4 contract to 33 vendors in 2010. It had a ceiling of $4 billion.

Bloomberg Government, which first reported the IRS’ decision, said the TIPSS-4 unrestricted and the TIPSS-4 small business have generated $3.8 billion in spending obligations since 2011. Bloomberg says 63 companies have won task orders under the contract, led by Deloitte Touche ($688 million), Booz-Allen Hamilton ($586 million), Northrop Grumman Corp. ($525 million) and IBM Corp. ($467 million).

The IRS’ decision not to renew TIPSS-4 is a big deal given how popular it has been compared to some of the best-in-class (BIC) vehicles; Bloomberg Government reported that Treasury has spent a combined $232 million on those vehicles since 2010.

OFPP is pushing agencies to reduce duplicative contracts by 13 percent by 2020 under the category management initiative. This is among the first major wins for that effort.

OFPP currently says there are 32 contracts determined to be best-in-class.

Bloomberg Government also reported in 2017 that the number of multiple-award contracts dropped by 239 over the previous five years, even as annual spending grew to more than $111 billion.

Another potential win to look out for is the DHS Eagle 3 acquisition.

Soraya Correa, DHS’ chief procurement officer, said at a recent industry day sponsored by Washington Technology that she sees the next-generation contract as much different from the current one.

“Between GSA, OMB and agencies, we created these best-in-class procurements where they provide a lot of generic services, so if there is a best-in-class procurement that fulfills our needs, we will go to that first. We will see what they offer and see what’s out there,” Correa said. “We do have plans and are working with the DHS CIO’s office to create a follow-on to Eagle and Flash [agile procurement vehicle], and it will be a combined procurement that is more uniquely tailored to DHS’ needs around what we need to do now and in the future, and what our components’ needs are. It will not look exactly like Eagle 2. It will probably be pretty different.”

Correa said she envisions the same thing for the DHS First Source IT products contract as well.

The DHS IT category management council, made up of staff from the CIO and CPO offices, is identifying what the agency needs to work on, including assessing current best-in-class vehicles and discussing the requirements the agency will need in the future.

Correa gave no specific timeline for Eagle 3, but the current contract expires in 2020 so acquisition planning is underway.

Read more of Reporter’s Notebook

House lawmakers express faith in IT modernization, but send a message too
https://federalnewsnetwork.com/reporters-notebook-jason-miller/2018/06/house-lawmakers-express-faith-in-it-modernization-but-send-a-message-too/ Mon, 18 Jun 2018 13:15:45 +0000

House lawmakers offered a strong show of confidence in the Technology Modernization Fund (TMF) by allocating $150 million for fiscal 2019.

The House Appropriations Committee approved the Financial Services and General Government fiscal 2019 spending bill last week with a 50 percent increase in funds over the 2018 level.

“The committee encourages GSA and the TMF Board established by the Modernizing Government Technology Act to prioritize and fund those projects that have the most significant impact on mission enhancement and that most effectively modernize citizen-facing services, including updating public facing websites, modernizing forms and digitizing government processes,” the committee writes in the bill’s report.

At the same time, the committee also increased the IT Oversight and Reform (ITOR) fund run by the Office of Management and Budget to $25 million from $19 million as well as the Federal Citizen Services Fund to $55 million from $50 million this year. In all, the administration would receive about $230 million for IT modernization efforts.

Even so, the committee warned the administration that it must offer more details on its plans.

“It is surprising, then, when staff charged with administering the new IT Modernization Fund refuse to share findings and respond to queries from the very committees that made the fund possible,” the report states. “The committee directs OMB and GSA to work more collaboratively with the relevant committees of jurisdiction in order to better evaluate the needs of agencies and opportunities for improving IT across government.”

The Senate Appropriations Subcommittee will mark up its version of the 2019 spending bill on Tuesday.

Still, all of this bodes well for the administration’s IT modernization efforts, which kicked into second gear with the TMF Board’s first three awards.

OMB recently released more specifics on how the departments of Housing and Urban Development, Agriculture and Energy plan to use the money to help satisfy committee concerns.

HUD, which received the biggest share of the $100 million so far, will use its $20 million to move five legacy services to the cloud.

“The current system is used by 30,000 users to access 100 HUD grant, subsidy, and loan programs that disburse $27 billion per year,” states OMB’s fact sheet. “According to HUD estimates, the code modernization and migration will save $8 million annually, enabling payback and generating working capital to transform additional legacy systems. The new modern platform will be a Java cloud-based application suite that will cost less to maintain and will enable functional and technical enhancements to be completed more rapidly and at lower cost.”
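By HUD’s own figures, the $8 million in annual savings would recoup the $20 million investment in roughly two and a half years.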

USDA received $10 million to further its Farmers.gov portal.

“This is an opportunity to update legacy systems and re-engineer processes and systems to reduce improper payments, address and resolve repeated financial audit findings, and properly connect these agency systems to the USDA common financial system,” OMB states about USDA’s proposal. “Without this funding, USDA would need to delay integrating this part of the process into the consolidated Farmers.gov Citizen Experience Portal in a later year when funds became available. However, with support from the TMF the project can be conducted at the same time as other enhancements to the Portal, faster.”

Energy will use its $15 million in TMF funding to quicken the pace of consolidating and migrating 64 separate email systems, serving more than 184,000 mailboxes across the agency, to a single cloud-based service.

“TMF investment will be used to migrate the 26 remaining email systems that service 47,080 mailboxes. With this migration, DOE will secure large scale operational benefits and costs savings,” OMB states. “Without this funding, DoE would need to conduct the migration of the remaining systems using a piecemeal approach, subject to fund availability. DOE anticipates it will have a greater ability to serve its mission more quickly across sites and capabilities, which will positively impact the American people. The operational benefits of this project include cost savings, increased efficiency, improved cyber posture and decreased operational risk.”
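Those figures imply that 38 of the 64 systems, covering roughly 137,000 of the 184,000-plus mailboxes, have already been consolidated.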

OMB is encouraging agencies to continue to submit proposals for a share of the remaining $55 million in the TMF for 2018. To that end, the CIO Council launched a website to help promote the selection criteria, provide documents and templates and answer common questions.

The board received nine proposals from seven agencies for this first round of projects, and many in the community have said the quality was lacking. If that is true, that feedback, plus the House committee’s report language promoting a specific type of project, gives prospective agencies a lot of good intelligence for winning future funding.

Read more of Reporter’s Notebook

Thornberry says DoD needs to do its homework early on cloud
https://federalnewsnetwork.com/cloud-computing/2018/06/thornberry-says-dod-needs-to-do-its-homework-early-on-cloud/ Mon, 18 Jun 2018 12:23:00 +0000

Congress wants the Defense Department to do something parents can never get their kids to do: finish their homework early.

Cloud is a contentious subject right now, and as the Pentagon prepares to award some serious funds for cloud services, Congress wants to know the department is doing it right.

That’s why House Armed Services Chairman Mac Thornberry (R-Texas) says the House version of the 2019 defense authorization bill is fencing off funds for the JEDI contract until Congress gets some concrete answers on cloud.

“I really want to know that they have thought through not just this particular contract, but where cloud is going throughout the department,” Thornberry said, during a June 14 meeting with reporters in Washington. “We are going to pay close attention to it not just because it’s a lot of money and because it’s a shift in the way they do business, but because cloud computing is going to be really important to the future of the military.”

Thornberry said cloud will be especially important for artificial intelligence, weapons systems and even the management of the department.

The chairman said he wants to make sure the cloud DoD invests in is searchable by auditors and its databases are well organized.

“The biggest problem in not having an audit is you have so many databases that won’t talk to each other. This has implications for the management of the department as well as the development of military capabilities, so it’s important to get it right,” Thornberry said.

The Armed Services Committees implemented hundreds of acquisition reforms over the past few years. Part of the aim was to make sure DoD spent more time figuring out acquisition projects before it started spending money.

Congress hoped that, by doing so, programs would be better planned, more efficient and faster-moving.

Thornberry said Congress is being especially careful about that as it moves forward with the JEDI contract, hence the fencing of funds.

Fencing off funds is “always used as a tool either to get information from the department or you are trying to get their attention that you have concerns that you think [DoD is] not fully satisfying and it’s a way to highlight it to senior leaders’ attention because presumably if there is a fence on money, the leadership of the department becomes aware of that,” said Andrew Hunter, a senior fellow at the Center for Strategic and International Studies.

The House 2019 defense authorization bill prohibits DoD from using 50 percent of the funds authorized for JEDI until the defense secretary can provide Congress with information sufficient to conduct oversight of the acquisition.

“The committee is concerned with the lack of information supporting the planned acquisition of JEDI from a single commercial provider. This includes lack of detail regarding security requirements and associated costs, anticipated cost-savings, migration costs, and how the Department intends to maintain the ability to leverage the latest cloud computing capabilities and preserve the ability to transition workloads and data to other providers,” the House bill report states. “Additionally, the committee has not been provided with details on customer capability requirements or how JEDI impacts current cloud computing services and other activities. … The committee expects the Department to provide sufficient information necessary for the conduct of oversight responsibilities.”

The House Armed Services Committee isn’t the only one upset about JEDI.

The Senate Armed Services Committee and the House Appropriations Committee want answers too.

The Senate bill includes language requiring DoD to lay more technological groundwork within its own networks as the department prepares to transition key IT systems from government-operated infrastructure to off-site cloud hosts.

Senators expressed concern that DoD’s Cloud Executive Steering Group, the body that’s leading the JEDI procurement, has been so preoccupied with planning the contract itself that it’s neglected the “enabling activities” and “preconditions” the department will need to address if it’s going to successfully support its IT systems once they’re in the cloud.

“The committee emphasizes the importance … of modernizing networks by adopting advanced commercial capabilities,” lawmakers wrote in a report accompanying the bill. “Network modernization is essential for cybersecurity, supporting service-level agreements for cloud services, ensuring efficient cloud access, and consolidating networks.”

Specifically, the legislation would demand that DoD draw up a plan to “rapidly acquire advanced commercial network capabilities, including software-defined networking, on-demand bandwidth and aggregated cloud access gateways, through commercial service providers.”

The plan would have to be delivered to Congress within three months of the bill’s enactment, assuming the Senate language survives negotiations with the House.

The House defense appropriations bill restricts DoD from moving any applications or data to JEDI until it supplies another detailed report explaining how the new single-award contract fits in with the rest of the department’s cloud activities.

The moratorium would stay in place from whenever the appropriations bill is enacted until 90 days after the Pentagon supplies Congress with more details on its overall cloud spending.

Specifically, it would demand that the department deliver a plan to set up a new budgeting system that accounts for all the funds the military services and Defense agencies will use to migrate their data and systems to any cloud environment, including JEDI.
