Open Data/Transparency – Federal News Network

Five ways to improve FOIA estimated completion dates
Tue, 05 Jul 2022

As backlogs for Freedom of Information Act requests grew during the pandemic, some agencies found success limiting processing times. Now, those agencies are offering best practices to improve the public information request process and make it easier for records custodians to calculate estimated dates of completion (EDCs).

FOIA, which celebrated its 56th anniversary on July 4, mandates that agencies provide EDCs on all public information requests, although many agencies do not, said Alina Semo, director of the Office of Government Information Services (OGIS).

The agency’s annual meeting on June 29 came after OGIS issued its annual report for fiscal 2021. OGIS is the congressionally mandated agency in charge of reviewing FOIA policies, procedures, compliance and improvement.

The report said OGIS handled 4,200 requests for assistance from both FOIA requesters and agencies. OGIS sees a fraction of the overall public information requests filed to various agencies. 

The FBI alone receives about 30,000 requests each year, said Michael Seidel, the bureau’s chief FOIA officer.

As agencies are still dealing with the fallout from the pandemic, OGIS reported the number of requests for OGIS assistance involving delays jumped 73%, from 220 cases in 2020 to 380 in 2021. In 85% of the requests about delays, the requester could not get an estimated date of completion from the respective agency. 

“Our assessment found that agencies were challenged even before the pandemic began to provide EDCs and the agency’s responses to such requests were mixed,” Semo said during the meeting.

The Office of Information Policy’s Summary of Agency Chief FOIA Officer Reports for 2021 said that by the end of fiscal 2020, 34 agencies had seen their backlogs increase by more than five requests. For example, the Department of Veterans Affairs had more than 1,000 requests backlogged in 2020, the agency’s 2020 annual report said. The VA closed its 10 oldest appeals in 2020.

A request is backlogged when it is pending beyond the statutory time period for a response. For requests, the statutory time period is 20 working days from receipt of the request, unless there are “unusual circumstances,” as defined by the law, in which case the time period may be extended an additional 10 working days. 
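That statutory clock is simple enough to express in code. As a rough illustration only (not any agency's actual tracking logic, and skipping federal holidays for brevity), a records office could compute the due date and flag backlogged requests like this:

```python
from datetime import date, timedelta

def statutory_due_date(received: date, unusual_circumstances: bool = False) -> date:
    """Count forward 20 working days (30 with the extension) from receipt.

    Simplified sketch: only weekends are skipped; a real calculation
    would also skip federal holidays.
    """
    remaining = 20 + (10 if unusual_circumstances else 0)
    current = received
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 ... Friday=4
            remaining -= 1
    return current

def is_backlogged(received: date, today: date, unusual_circumstances: bool = False) -> bool:
    """A request is backlogged once it is pending past the statutory period."""
    return today > statutory_due_date(received, unusual_circumstances)
```

For a request received Tuesday, July 5, 2022, the sketch yields an Aug. 2 due date, or Aug. 16 with the extension.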

Of the 35 agencies with backlogs, 14 processed more requests than in the previous fiscal year, OIP said. Meanwhile, 26 medium- and high-volume agencies reported reducing the number of requests in their backlogs.

FOIA officers from the Federal Emergency Management Agency, the Postal Service and the FBI laid out five tips agencies may want to consider implementing to reduce backlogs, including being proactive in alerting requesters when records custodians delay EDCs and creating negotiation teams.

Proactive communication about EDCs

Among the most common recommendations from the panelists was open and proactive communication between records custodians and requesters.

Gregory Bridges, chief of the disclosure branch of the records management division at FEMA, said proactive communication begins with agencies providing an EDC. 

At FEMA, Bridges said, getting records custodians comfortable with the concept of providing an EDC was the first struggle. He said agencies with similar issues should base the EDC on the time it would take to complete the request if they worked on nothing else.

“There’s nothing wrong with telling a requester ‘we think it’s going to be ready by this date. If we don’t think we can meet that date, we’ll definitely reach out.’ But you have to reach out. If you’re saying, ‘the 25th,’ by the 20th or the 24th, you should have an idea of if you can meet the 25th and if you know you can’t, let the requester know before the 25th,” Bridges said at the annual meeting.

In his time at FEMA, Bridges said he finds the main complaint from requesters is thinking they have been forgotten. “Even if they don’t like the date being extended, at least they know that you’re actively working on it,” he said. 

He also said providing a clear timeline and EDC may encourage requesters to narrow the breadth of a request if they have asked for more records than they need and want them by a certain date.

“Even if you have to extend [the EDC], then that could be another opportunity to narrow the scope,” he said. 

Records custodians at FEMA find it helpful to walk requesters through exactly what their request covers, in case it is more than they need.

“One of the things we do at our agency is explain to requesters why searching for all of the emails with the word ‘hurricane’ during hurricane season might produce more records than you’re actually looking for,” Bridges said.

Similarly, Nancy Chavannes-Battle, the deputy chief FOIA officer at the USPS, recommended providing partial responses as files become available when requests take longer than originally estimated.

At the FBI, an online tool can tell requesters what stage their request is in and direct them to a public information officer (PIO) and the negotiations teams, who can answer questions and keep communication open throughout the process.

Negotiations teams

The FBI’s negotiations teams review the files and interact with requesters to answer questions. In fiscal 2022, the FBI received over 14,000 emails and over 1,100 phone calls about requests, Seidel said.

“We find that a lot of our requesters engage with our public information officer to get more information about the request and that’s where the discussion about the EDC really happens,” Seidel said. 

Negotiations teams ask requesters what they are looking for, such as a specific event, a date range or an interview, in order to stop processing unwanted pages, which, in turn, delivers records faster.

“We’re able to serve more requesters and give more requesters more information more frequently,” Seidel said. 

Seidel said the negotiations process has eliminated the processing of over 66 million unwanted pages that were originally requested. 

Automated EDC organization 

The FBI FOIA office uses automated multitrack processing programs to estimate the average number of days it takes to complete a request. 

They organize their requests into four tracks based on page count: small, medium, large and extra large. At the bureau, the small track includes requests of 1 to 50 pages and the medium track 51 to 950 pages, although other agencies that implement similar processes can adjust track sizes as needed.

“We’ll look at those dates within those queues of the dates requests were opened, we’ll do the math and compare them to the dates they were closed and we’ll come up with that average number of days it takes to complete a request within that queue,” Seidel said. 

The FBI FOIA office runs those audits every six months. Seidel said an initial challenge when implementing the program was deciding the right frequency for running the audit.
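The audit Seidel describes amounts to a small batch computation: bucket closed requests by page count, then average the open-to-close interval per bucket. A minimal sketch under stated assumptions (the article gives only the small and medium page ranges; the large/extra-large cutoff below is a hypothetical placeholder) might look like:

```python
from datetime import date
from statistics import mean

# Track boundaries by page count. The 1-50 and 51-950 cutoffs come from
# the article; the large/extra-large boundary is an assumed placeholder.
TRACKS = [("small", 50), ("medium", 950), ("large", 5000), ("extra_large", float("inf"))]

def track_for(pages: int) -> str:
    """Assign a request to a processing track by its page count."""
    for name, upper in TRACKS:
        if pages <= upper:
            return name
    return TRACKS[-1][0]

def average_days_by_track(closed_requests) -> dict:
    """closed_requests: iterable of (pages, opened, closed) tuples.

    Returns the average number of days from open to close per track,
    the figure an office could quote as the EDC for new requests in
    that queue.
    """
    buckets: dict = {}
    for pages, opened, closed in closed_requests:
        buckets.setdefault(track_for(pages), []).append((closed - opened).days)
    return {name: mean(days) for name, days in buckets.items()}
```

Run over six months of closed cases, this yields one average per queue, refreshed at whatever audit cadence an office settles on.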

Estimated dates of completion: They’re not just for the requester

Although EDCs are required by FOIA, Bridges said, they do not benefit only the requester.

“You should be establishing EDCs, just for your office’s knowledge,” Bridges said. “It’ll help you gauge your output, what can you expect to go out the door. So even though this does benefit the requesters in a big way, it can also benefit your office from managing your requests in a big way too.” 

He said every agency should incorporate the EDC timeline into the processing of all their records.  

“If you put all of your requests on an EDC that can also help you factor in how long it’ll take you to work on a particular request. Because you’re considering your current workload,” he said. 

Chavannes-Battle, from USPS, said PIOs setting EDCs should break down what lengthens a request’s processing time, such as referrals to other offices or reviews by corporate communications and legal departments.

Connect with agency leadership

The OGIS assessment found support from agency leadership to be critical to success in meeting requirements such as providing EDCs.

At USPS, Chavannes-Battle said, the staff training program is an important way PIOs connect with agency leadership and keep the process functioning smoothly.

At FEMA, Bridges says training staff to work with attorneys at agencies helps set EDCs and return records quicker “especially when you’re dealing with senior managers who aren’t familiar with the FOIA process, and you’re coming into their program, trying to tell them why they need to make our work a priority against their work.” 

“It really is just understanding what that particular manager or leader cares about when it comes to the FOIA process,” he said. “Do they care that they’re in compliance? Some do, some don’t. Oftentimes, they care about not getting in trouble. They care about not having to spend money, they care about not having to get sued.”

In FEMA’s FOIA department, PIOs treat senior managers as enforcers to get their staff to comply with FOIA requirements, he said.

Benefits of recommendations

FEMA began implementing the new procedures in 2018. Bridges said the agency has been sued only twice over FOIA records since then. And while appeals over denied requests previously numbered between 1,200 and 1,700 a year, they have since fallen to around 45 annually.

“It isn’t because we don’t get those kinds of people [watchdogs], it’s because of implementing this new procedure and really establishing these response dates,” Bridges said. “That’s part of setting the expectations with the requesters so once we got people familiar with it, we really started to see appeals and challenges to our final responses reduce significantly.”

Current, former Hill staffers say centralized authority needed to modernize Congress
Mon, 27 Jun 2022

The upside to Congress’ decentralized nature is that innovation can come from anywhere. The downside is that coordinating those innovations is hard.

Current and former Hill staffers say technology can and has solved many common problems for members of Congress, but they want to see members tap into more commercial-friendly platforms and give centralized authority to bodies like the Bulk Data Task Force, or the House Digital Service.

Stephen Dwyer, senior adviser to House Majority Leader Steny Hoyer (D-Md.), pointed to solutions such as the Dome Watch and Dome Directory mobile apps, created by the office to help members of Congress, their staff and the public better track movements on the House floor. The 13-year-old private intranet DemCom for House Democratic staff was also redesigned last year with expanded access for Senate staff, mobile functionality and a bigger database of information.

But custom-built systems for “uniquely Congressional purposes,” as Dwyer said, are not all that’s recommended. He told the House Select Committee on Modernization last week that the programs are representative of what is possible when the legislative body coordinates its technology efforts, but that doing so requires in-house digital staff for each office. He recommended hiring digital aides with programming and development skills for every member, in addition to more traditional political science and communications staffers.

“There’s just so much more that they need to do, even versus five, 10 years ago when I was in a congressional office,” Dwyer said. “A lot of that is in digital communications. Every office needs to not just take a bunch of pictures and post them on Twitter and Facebook but they have to do more technical Facebook Lives, they’ve got to take their boss live, there’s a lot of technical tasks that didn’t exist many years ago.”

But Dwyer said Congress needs to recognize the demand for these workers and compensate them appropriately. The House is raising its staff salary floor to $45,000, after essentially a decade-long pay freeze and record inflation made it difficult to attract and retain employees.

One of the Modernization committee’s recommendations last Congress was to create a common committee calendar portal to reduce scheduling conflicts. Vice Chairman William Timmons (R-S.C.) asked witnesses for suggestions to get the ball rolling on what he said could have a big impact on members and staff. Reynold Schweickhardt, Lincoln Network senior adviser, said one issue is that among the House clerk, the chief information officer and the committees themselves, there is no clear button to push for technology needs. As such, another recommendation was to clearly assign responsibility for the legislative product.

“I think the other challenge that I alluded to is there’s no gatekeeper for scheduling projects. The CIO, they may be working on five to 10 projects, so they tell you they’re working on your project and they are, but they’re sort of shuffling things back and forth,” said Schweickhardt, who served at the Committee on House Administration for 13 years and the Government Publishing Office for eight years. “Versus a program-management kind of functionality that says, ‘What are the three things we want to accomplish in the next couple of months? Let’s knock ‘em out. Let’s figure out what the next set of important things are.’”

Dwyer said the foundation is laid. For several years, House rules have required all committees to post hearings and testimony in a central place, putting the body ahead of the Senate, but amplifying that with a more consumer-friendly version would help, he said.

Melissa Dargan, co-founder of AppMy LLC and a former Hill staffer, seconded the use of funds for a centralized scheduling platform. It’s a problem she tried to tackle when she launched the TourTrackr app to better manage constituent tour requests.

“From constituent tour requests to flown flag purchases, these important responsibilities were tracked using printouts, binders and Excel spreadsheets. It was a fragmented, inefficient process. At the time, there were no digital alternatives that House offices were approved to use. So while these tasks seemed easy, they were tedious, repetitive and time consuming,” she said, adding that technological innovation does not need to come at the expense of security.

“I respect and understand that the House has high standards for new tech approval. Protecting security and personal identifiable information are critical to ensure the integrity of the institution. That said, upholding those priorities and creating a welcoming environment for new tech products can be done simultaneously,” Dargan said.

After a long-term study, evidence-based decisions need trustworthy data
Thu, 16 Jun 2022

In the push for evidence-based decision making, data-driven studies are scrutinized more and more at federal agencies. Long-term studies that rely on multiple stakeholders are vulnerable to environmental changes, technological barriers and personnel cooperation, all of which affect the data.

Trust underpins evidence-based innovation, but for Teri Caswell, a broadband program specialist at the National Telecommunications and Information Administration, all parties still need to agree on their definition of trustworthiness.

“Is the evidence that’s been provided or sought after trusted because it’s been tried and true? Is it trusted because of the community and the audiences of the people who are touching it or defining it, or presenting it?” she said as part of the Performance Institute’s 2022 Government Performance Summit on Wednesday. “I personally believe that if I am part of a compilation of evidence or artifact that has been and there’s a delineation there as well, but if I can present it once and then, when asked, present it again in a different labeling, packaging, compilation — however you want to phrase it — it still has to be trusted.”

In other words, context matters.

Trust is also important to Shonda Mace, a project manager in the Texas General Land Office’s Community Development and Revitalization program, who has experience working with FEMA and the Department of Housing and Urban Development on long-term disaster recovery. Since Hurricane Harvey in 2017, her office has been conducting regionalized flood studies, which she said were purposely regionalized because local communities often do not communicate with each other, sometimes because of a lack of trust. Her team has to be that reliable go-between.

“So one big thing we’re doing is we’re working with not just the communities, but also other state agencies, and federal agencies to break down silos and work together,” Mace said. “If you don’t have the trust amongst the other agencies and your partners, if you don’t have the trust in most communities, you’re not going to get the information you need to move this project forward.”

With long-term studies, it can be difficult to keep all stakeholders engaged over time. Mace said communities want fast answers and, after a natural disaster has passed, the energy for impact studies can fade. She said it takes a balance of not exhausting stakeholders with outreach, but also not waiting so long between contacts that they forget about the study altogether. While multiple agencies in Texas are conducting studies similar to her team’s, they must be careful not to duplicate efforts funded by federal dollars or risk relinquishing that money.

Caswell added that there should be client knowledge management in the background of long-term studies. External and environmental considerations can change over the course of a study, such as budget cuts, political shifts or another entity assuming the program area.

“The positive side of that is, the more willing we are to look at evidence-based criteria to drive innovation, we should be seeking more than one or even 100 inputs to that innovation design, lest we become, a reputation of doing things in a vacuum and we didn’t consider 80% of our benefactors,” she said.

Her recommendation was to track the key words and phrases that change over the course of multi-year studies, a vocabulary list or checklist of sorts, to maintain some level of consistency in the data so that questions are adequately answered by the end.

She also spoke to the question of whether to share information as you go, as opposed to waiting until the end of a study to show stakeholders the data. Subject-matter experts or senior advisers can add anecdotal knowledge to a report headed to an elected official, but that knowledge may be hard to digitize. In this case, Caswell said, it helps to take an iterative approach to informing stakeholders and to review the study with them before completion.

“I am done with the days of writing a summary report before we’ve even looked at the data. Let’s get it on a paper, let’s get it on report, get people around our camera or table, whatever it takes, and start recognizing what it does look like and are we on the correct path? And if we’re not, there’s your first [knowledge management] piece, right?” she said. “We went this way to prove or disprove the hypothesis, we’ve got a course correction we need to affect, we’re notifying all parties, we’re having a conversation, and we’re building the trust in the process, not just the report.”

However, she said, a major consideration that complicates data collection from stakeholders is the technological requirements involved when partnering with federal agencies. Normalizing technology at the federal level takes a long time, and as cybersecurity requirements increase, so do the possible challenges for stakeholders submitting data for studies. Mace’s office encountered this on its contract with the Army Corps of Engineers to review modeling: Sharing large files via Box or SharePoint was no longer an option, as USACE’s file sharing system was not large enough.

Yet the Texas General Land Office saw this as an opportunity to innovate. Mace said the GLO is working on a new Texas Disaster Information System with the University of Texas and Texas A&M University, “where our vendors can put models into there and USACE can go in and access them and get those models.”

New VA portals provide simplicity, transparency to vendor interactions
Mon, 13 Jun 2022

The Department of Veterans Affairs is trying to streamline how contractors interact with the department. Last month it announced the Pathfinder site, a new digital one-stop shop for contractors looking to work with the VA. On June 7, that website officially went live.

“This is a focused point of entry to selling and innovating with VA for our industry partners,” said Michael Parrish, VA’s chief acquisition officer, in a press release. “It fills a gap we’ve found in the acquisition lifecycle by creating the fusion of acquisition and innovation with this intelligent system.”

Parrish said last month that the new Pathfinder site will help vendors that haven’t worked with VA before get certified through SAM.gov. The site will also pre-filter solicitations to show only VA-specific ones.

“One of the great things about [the Pathfinder tool] is that it does provide a connection to that more innovative area,” said Charles Worthington, chief technology officer for VA’s Office of Information Technology. “So in cases where it’s a solution that the community thinks might help solve a problem that VA has, there is a path that it’ll lead you down, that can kind of flag you into that.”

Luwanda Jones, deputy chief information officer at VA, said during a VA Advanced Planning Brief to Industry on June 8 that the site will also facilitate vendor engagement scheduling. Eventually, she said, it will replace the VA’s IT Vendor Management Office scheduling email.

But Pathfinder isn’t the only way VA is trying to make it easier for vendors to work with them. Worthington said VA is trying to build a culture of transparency on the IT side, especially with vendors, in order to make it easier for them to integrate solutions. That’s evidenced in the new Lighthouse program, Worthington said.

Previously, he said, the team behind Lighthouse was primarily focused on vendors building third-party applications on top of VA services. For example, that might include a veteran linking their VA medical records to an approved health care app on their phone.

But now that team is turning its focus to internal interfaces as well.

“You’re going to be seeing more about this in the coming weeks and months. But what we’re really hoping to do is for all the different systems that run within our VA internal environment, we want to document the interfaces that those systems provide, and make it easy for teams to use those interfaces,” Worthington said. “We want to focus on making it easy for teams to leverage existing capabilities that are existing in other systems. And Lighthouse is a key way that’s going to help us do that. So more to come on this but keep your eyes open. And really, teams should be thinking about the core transactional capabilities of whatever system you’re working on. Exposing those to other systems in a standard way with an API is really the approach that has been most successful for integrating systems at VA, and we’re going to be leaning into that even more with our Lighthouse Developer Experience platform.”

Worthington also mentioned a VA design portal that’s intended to provide style and format guidance to contractors working with VA, in order to match the look and feel of the VA website. The idea is to ensure consistency across all communications involving the VA. Worthington said that can range from as simple as whether or not to capitalize the V in veterans, to technical minutiae like fonts, margins and color palettes.

Finally, Worthington discussed a shared service for notifications that VA is beginning to implement on its internal network. It will allow vendors to send personalized notifications to veterans via email or text. So, for example, they can let veterans know their application was received, or their prescription was updated. Worthington said the service is already integrated with VA’s veteran profile databases, as well as with its veterans preference engines. It also has a high level of trust with email providers, meaning it’s more likely to bypass spam filters.

FOIA advisers recommend independent review into how DHS handles immigration record requests
Thu, 09 Jun 2022

Congress should fund an independent review into how the Department of Homeland Security handles immigration records requests, one of the largest drivers of the Freedom of Information Act backlog, according to an advisory group.

The FOIA Advisory Committee approved its final report for the 2020-2022 term on Thursday. The committee is now seeking applications for new members ahead of the first meeting of the 2022-2024 term in September.

The committee’s final report digs into a range of legislative, technology and process recommendations, including suggestions for how to improve “first-person requests,” where individuals seek access to government records on themselves.

The report notes DHS accounts for about half of all FOIA requests received by the federal government, with most involving immigration records scattered across U.S. Citizenship and Immigration Services, Immigration and Customs Enforcement, and Customs and Border Protection.

Alien Files, or A-Files, are the largest category of first-person FOIA requests, according to the report. The immigration files are crucial to those requesting immigration benefits or seeking to defend themselves in court proceedings.

“This use of FOIA for administrative discovery is a significant driver for the USCIS backlog,” the report states. “As a first-party alternative to the FOIA queue, we recommend that USCIS extract A-files and establish a fast-track processing alternative to FOIA.”

The report recommends a “non-governmental entity with expertise in research and development” assess DHS processes, workforce, and existing technology related to A-files and FOIA requests.

During Thursday’s meeting, the committee voted to amend the final report to recommend that Congress provide supplemental funding to DHS to support the independent assessment.

“We do know that there are a lot of unfunded mandates that we put on our FOIA offices,” Michael Morisy, MuckRock’s chief executive and a member of the advisory committee, said during the meeting. “I think it would help actually make sure that this happens, because otherwise it would be tough to pull off because it will require dedicated funding that is going to be hard to find. I think it would also give Congress a chance to provide oversight if they’re earmarking this funding.”

DHS FOIA backlog

DHS gets the most FOIA requests out of any individual agency, receiving an average of 250,000 requests annually between 2009 and 2019, according to a DHS FOIA backlog reduction plan released in March 2020. The majority of the requests are related to immigration records.

But the agency was able to reduce its FOIA backlog by 25,102 cases last year, a more than 30% reduction, according to the 2022 DHS Chief FOIA Officer report.

DHS has targeted its processes around immigration records requests in particular. In FY-21, USCIS reduced its A-file backlog to 58 requests, a 99.6% reduction, according to the report.

USCIS is now able to directly access, process and request ICE documents. An agreement between the two agencies gives ICE 48 hours to review and approve documents before they are returned to USCIS and combined into an A-file to share with a requester.

“This agreement does away with the need to refer documents to ICE and provides greater customer service by reducing processing times and delivering one consolidated response to the requester,” the report states.

DHS moving to new FOIA system

The advisory committee’s recommendations come as DHS shifts to a new FOIA processing system. The agency says the new system will allow it to process records faster and more accurately. The new system will provide FOIA professionals with “advanced e-discovery tools that are common in the private sector,” according to DHS.

The move will start with the Privacy Office at DHS headquarters in late June. The schedule for other DHS components is yet to be decided.

“We are staggering transition to limit the disruption in service,” a DHS statement says. “The length of the transition, and the date when other FOIA processing centers begin the transition, depends on the time it takes to migrate data.”

The DHS Chief FOIA Officer report provides some more detail on the new system, which is contracted out through the DHS Privacy Office.

“The Privacy Office required that the new solution include several key features that will decrease the administrative burden and improve processing,” the report states. “Critically, the solution provides users with access to powerful e-discovery tools that will improve processor’s ability to efficiently review the large volume of electronic records often associated with complex FOIA requests. The solution will also be interoperable with other FOIA tools in use at the department and allow requesters to submit requests online more efficiently. The solution also provides video redaction capabilities, which we expect will be increasingly important as the department increases its use of body worn cameras.”

Study looking at how agencies make their crucial legal documents publicly available https://federalnewsnetwork.com/open-datatransparency/2022/06/study-looking-at-how-agencies-make-their-crucial-legal-documents-publicly-available/ https://federalnewsnetwork.com/open-datatransparency/2022/06/study-looking-at-how-agencies-make-their-crucial-legal-documents-publicly-available/#respond Fri, 03 Jun 2022 16:27:32 +0000 https://federalnewsnetwork.com/?p=4088024

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

The Administrative Conference of the United States is concerned with how federal agencies carry out laws. Now it’s launched a study to examine the laws governing disclosure of agency legal activities, and whether there’s a way to streamline legal disclosure. It’s an effort that potentially affects every single agency. For more, the Federal Drive with Tom Temin turned to one of the study group members, Ohio State University law professor Margaret Kwoka.

Interview transcript:

Tom Temin: So this study group is looking at things to recommend to Congress to do what exactly?

Margaret Kwoka: As it stands right now, there are a variety of statutes that mandate that agencies publish certain types of materials on their websites, or make them affirmatively available. Some of them have to be published in the Federal Register. And so we’re looking at sort of a variety of statutes that impact the question of what agencies must disclose on a proactive basis. And that includes FOIA, of course, the Freedom of Information Act, it includes the E-Government Act, it includes the Federal Register Act, among others, and certainly there are substance-specific areas as well. And each of them can impact an agency’s obligations. But we’re looking specifically at sort of a subset of records that those statutes affect. By agency, we’re sort of looking at agency legal materials, and the guiding principle for us is that we have a principle against secret law, that agency law should be public, that the public should know what agencies do that sets forth binding rights and obligations of those that are subject to the agency’s authority, as well as other kinds of documents that constrain agency action or explain agencies’ decision-making guidance for stakeholders and the general public. And so the idea is to identify key statutory changes that Congress should consider to kind of clarify, potentially strengthen in some areas, and harmonize some areas where statutes are either unclear or overlapping, or sometimes even in some amount of tension with one another.

Tom Temin: Well, what is the problem that you’re seeking to fix with disclosure laws and to some extent, I guess, disclosure procedures that derive from those laws? What’s the problem?

Margaret Kwoka: Well, there’s sort of a variety of types of problems that we may well be taking up, though I will say we’re pretty early in the process. We have a public request for information and comment out already. We’re also convening sort of a consultative body that will be giving us input as well. And so it may be premature to give you a definitive list, but I can give you some examples of the kinds of things that we’re looking at. So some of them sound a little bit like technicalities. We have some statutes that refer to each other and do so in a way that makes agencies’ obligations, frankly, confusing. And as a result, we may be able to recommend some changes that would clean up those obligations and make them clear both for agencies and the public. There are areas where a statute mandates disclosure of a certain type of material, but the definitions are very unclear. And in lots of those areas, there isn’t a lot of judicial interpretation. And agencies are sort of left to their own devices to come up with what those mean, or might follow DOJ guidance. So there are areas where we think there is room to strengthen the obligations so that more agency legal materials are readily available to the public in a way that the public will be able to access them. So this is an effort to look at the issue very comprehensively, both in terms of how to make the statutes work well with each other, and also how to think about these sorts of obligations as one set of obligations instead of the sort of, more or less, piecemeal approach that has been taken with various statutory changes.

Tom Temin: We’re speaking with Margaret Kwoka, she teaches law at Ohio State University, and she’s part of a study group convened by the Administrative Conference of the United States. And when you say legal materials, this is more than just case law or materials related to lawsuits, by or against the government?

Margaret Kwoka: Absolutely, in fact, primarily not those sorts of materials, because of course, those materials would already be typically available in a public forum, because the court system would have a filing system that would be accessible. So for agencies in litigation, most of those records are already available. We are looking at the question of whether we might be including recommendations about agency settlements in litigation, which are not always public. But that’s sort of one very small subset of the issues. I would say more than that, you know, we’re looking at, in particular, rules, which I think are already clearly defined in the statutory disclosure requirements. But we are looking at whether all those requirements still are adequate in terms of where the materials need to be published, for example, or how they should be published so that the public can best access them. But more than that, guidance documents, what needs to be put on a website versus what should be published in the Federal Register, and also the scope of what counts as a guidance document can vary widely from agency to agency, but that could include things like manuals or policies for inspectors or enforcement options, penalties, etc. We’re also looking at adjudication materials, so as it stands right now, the prevailing interpretation of FOIA’s requirement that agencies publish orders made in the adjudication of cases is that it would only really be applicable to decisions that the agency considers precedential and binding on future decision making. We’re looking at the question of whether there may be other broader sets of those kinds of decisions that should, in fact, be affirmatively published instead of –

Tom Temin: Sure, because what’s routine to one person might be precedential to someone else from the external side?

Margaret Kwoka: That’s right. And I think sometimes, we’ve seen instances where agencies may be treating a certain issue commonly throughout a volume of decisions where no single one of them has been designated as precedential or binding. But that’s the agency practice. And there may not be other records out there that really disclose to the public how that issue is being treated. Is there a way to define this set of records better, in a substantive way, as well?

Tom Temin: And would this, maybe somehow, when this is all resolved, would it speed up the FOIA process by taking away some of the discretion that FOIA officers spend so much time trying to exercise?

Margaret Kwoka: One of our hopes is to discuss sort of the costs and benefits of any of the recommendations that we do eventually make. We are looking at examples and some of the empirical work that we can point to, showing that proactive disclosure of categories of records, in some instances, not in every instance, but in some instances, can cut back the number of individual FOIA requests an agency has to process, because those records are already being made available. And so I think there is a potential administrative savings, both dollar savings, but also simply a reduction in the kind of bureaucratic processing, the bogging down, that comes with that other set of obligations as well.

Tom Temin: And what is the timeline here? Your study group will then make recommendations to ACUS, to the Administrative Conference [of the United States], then they would be the ones to actually make them directly to Congress?

Margaret Kwoka: That’s correct. The consultant group is looking at trying to get as much public input as we can, input from our consultative group of experts, from agency reps and also from public members. But also, you know, we’re trying to get that done maybe over the summer, and then looking at using the fall to get through the ACUS committee process and for ACUS to consider these recommendations for them to potentially adopt.

Tom Temin: And anyone can comment, not just the FOIA officers, but people that might just be believers in open government that follow these things.

Margaret Kwoka: Absolutely. And we’ve been trying to publish that notice, the request for comments, very broadly, and get it out to as many of the stakeholders as we can directly, but certainly members of the public can comment, and we encourage them to. We are looking to push out this request to all stakeholders and as widely as possible. This input is a really important part of the process, and it’s a large part of why I can’t say definitively what direction this project will go in: we are [moving] to gather all the information that we can.

AI contracts have special licensing needs to prevent bias, encourage transparency https://federalnewsnetwork.com/artificial-intelligence/2022/05/ai-contracts-have-special-licensing-needs-to-prevent-bias-encourage-transparency/ https://federalnewsnetwork.com/artificial-intelligence/2022/05/ai-contracts-have-special-licensing-needs-to-prevent-bias-encourage-transparency/#respond Wed, 18 May 2022 15:32:14 +0000 https://federalnewsnetwork.com/?p=4064776

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

Artificial intelligence software isn’t like other software, especially when it comes to acquiring it and licensing it. The data requirements to ensure lack of bias in AI, and transparency in how it works, are not part of the standard license agreements. This is all the subject of a study by the School of Business at George Mason University. Study author and senior fellow Benjamin McMartin joined the Federal Drive with Tom Temin to discuss some of the warnings.

Interview transcript:

Tom Temin: Mr. McMartin, good to have you on.

Benjamin McMartin: Hey, Tom, great to be on.

Tom Temin: So you looked at contracting for artificial intelligence and what are the big differences? It’s just software, but maybe not?

Benjamin McMartin: So at its core, yeah, it’s software and should be easy enough. But there are elements of AI that are particular and actually create some challenges within the acquisition environment. The Center for Government Contracting at George Mason, as well as my co-author Maj. Andy Bowne, who is the chief counsel of the MIT AI Accelerator in the Air Force, really looked at some of the current challenges that DoD is having in procuring AI software technologies, particularly when it comes to licensing.

Tom Temin: And what are some of the challenges?

Benjamin McMartin: The department and, honestly, the federal government are looking at the issue of responsible AI. So how do we look at AI technologies and identify whether there are inherent biases, whether we’re able to explain the results that we get? And so while you may not be able to explain why Spotify has recommended certain songs for you, or why Tinder has sent you on a certain date, in the Department of Defense, we must be able to identify and explain the results that we get from AI software. The results and the impact of the results are much more dire. And so those are the types of issues that we looked at in this paper, which is: how do we develop licensing schemes within the current constructs that allow the department to get the type of information that you need to actually explain the results that you get from artificial intelligence?

Tom Temin: Well, isn’t that just embodied in the logic of the AI just as any outcome with software is embodied in its logic?

Benjamin McMartin: So you may be able to get results out of your AI and understand that, hey, I got results based on some algorithm. The question for the department is, can you actually have access to that? Most of these technologies are not being developed within the department. They’re being developed in private industry at very high private expense. And so these are big, upfront investments that companies are making. The department traditionally has looked for licensing rights and technologies that allow them to do a few things and these are no surprise, right? What do I want to do with data rights? I want to make sure I don’t get locked into a vendor, I want to make sure that I have the data that I need to do test and evaluation and sustain systems for a long, long time. But even that level of data rights does not give me the access I need to explain what was the background data? How was this developed? Why am I getting the results that I’m getting based on the background data? These are traditionally not things that are developed and delivered under a traditional DFARS – Defense Federal Acquisition Regulation – supplement data rights license scheme.

Tom Temin: Got it. We’re speaking with Benjamin McMartin, he’s a senior fellow at the George Mason University School of Business’ Center for Government Contracting, and an attorney, we should also point out. So what can be done? What about the Air Force and the Navy and the Army, which are all pursuing this? What can they do?

Benjamin McMartin: The purpose of the study that we did, again, in partnership with George Mason and the MIT AI Accelerator with the Air Force, was to create a framework, a practical framework, for how acquisition professionals across DoD, and honestly across the federal government, can look at licensing that does two things: One, it gives access to the type of background data that you would need to understand the results that you’re getting from AI solutions. But two, it gives the opportunity to balance. And this is an issue that we kept at the forefront of our paper: the more data and background data that you ask for from industry, the higher the likelihood that folks are not going to want to work with you. And so you have to over-communicate what you’re using this data for, what the limits on the use of the data are, and how those custom licensing structures are going to work. This is a challenge. This is a communication challenge, to be able to say to a company, “I’m going to need your background data. I understand in your commercial practice you don’t give that to anybody, it’s not part of your business model. For DoD’s uses we’re going to need to look at it, but we’re going to procure a license to it, it’ll be limited, and you’ll understand exactly what we can and can’t do with it.” And so in our paper, we’ve provided that framework going through all of DoD’s responsible AI principles, which honestly were developed out of national policy and promoted by the Joint Artificial Intelligence Center. And they’ve done a great job of identifying what those principles are.

Tom Temin: Yeah, so the government is highly aware of this limitation in current licensing. Is there anything in the FAR or the DFAR that can enable this type of licensing request in the first place? Do we need a DFAR update?

Benjamin McMartin: So the nice part about the DFAR is, contrary to what a lot of people might say, it’s pretty flexible. It’s got the opportunity, and in fact, it encourages the development and negotiation of specially negotiated license rights. Now, there are some limits. But for example, the Joint Artificial Intelligence Center, through the Tradewind [other transaction agreement], is finding a lot of success in going outside of DFARS and drafting these custom licensing agreements that are pretty close to what you could get with DFARS, but there are some nuances. But within the DFARS licensing scheme, the framework that we’re proposing through our study in our white paper provides you examples of how you can achieve this within the current framework and the DFARS, or under OTAs, which give you even more flexibility. But ultimately, there are going to be some issues that are going to come up in the future. And we expect these will be the subject matter of future white papers, which is ultimately through machine learning. There is a point where the machine is developing the data. And the current DFAR scheme is based on who has developed the data and who has funded the data. There becomes a point in machine learning where the machine has developed the data. And the current scheme has not been developed to understand how that will work.

Tom Temin: And what about the source code? Because that could be also something required to have full transparency and the audit capability that DoD wants in AI software, can that be part of this mix also?

Benjamin McMartin: Absolutely. So source code, especially when it comes to machine learning models and artificial intelligence, is key to understanding how the algorithms have developed, how they’ve modified, how they’ve learned. And ultimately, you need to know what the input data is and what the source code is to understand the outputs that you’re getting. The scheme that we’re proposing through our white paper, however, is that those should be special licenses set aside; there shouldn’t be a one-license-fits-all for these types of acquisitions. You should sit down and say, okay, source code, this is super important for us for a couple of purposes. And for a limited amount of time, we are going to negotiate a very narrow, very specific license for that piece of it. And then for other stuff, there’ll be larger licenses. Ultimately, companies want to sell to the Department of Defense. But they want to make sure that they maintain their competitive advantage on the commercial market, and honestly, they want to make sure that they remain a preferred vendor for federal agencies as well. And so you really have to get in the weeds on each type of data or software and negotiate those as custom license agreements.

Tom Temin: So the issue then is not what’s in the FAR, the DFAR or the law or regulations. It’s simply a matter of trust, and being able to craft very detailed, one-off or bespoke contract licensing agreements as you adopt AI.

Benjamin McMartin: AI suffers the same challenges as a lot of federal acquisition. And it’s communication. Ultimately, the policy of the regulation is to only negotiate the license rights you need for the purposes you need them for, that policy has never changed. There’s a couple of issues there. One is communication and this need to inform industry that these are the purposes that we’re going to use this for. There are the flexibilities in the law to allow us to do this, the policy demands it. And honestly, the policy benefits both industry and government for that. The second piece, Tom, to that is education. And I am encouraged by congressional actions in the last two [National Defense Authorization Acts] to promote and find money for AI literacy among the acquisition workforce, which is needed because these are not things that folks are going to find on a template. You actually have to sit down and develop these agreements and understand the technology at least to a degree where you can competently advise on license terms.

Tom Temin: Attorney Benjamin McMartin is a senior fellow with the George Mason University School of Business’ Center for Government Contracting. Thanks so much for joining me.

Benjamin McMartin: Thank you very much, Tom.

Postal Service agrees to provide more transparency during election season
https://federalnewsnetwork.com/federal-newscast/2022/05/postal-service-agrees-to-provide-more-transparency-during-election-season/
Mon, 16 May 2022 16:11:25 +0000

To listen to the Federal Newscast on your phone or mobile device, subscribe in PodcastOne or Apple Podcasts. The best listening experience on desktop can be found using Chrome, Firefox or Safari.

  • The Postal Service settled another lawsuit stemming from the 2020 presidential election. USPS agreed to publish election mail and mail-in ballot guidance on its website ahead of primary and general elections through 2028. USPS also agreed to meet with states’ attorneys general to review its election mail performance and provide weekly on-time delivery data. Pennsylvania led five other states and the District of Columbia in the lawsuit. The settlement is similar to how USPS resolved another lawsuit led by the NAACP and Public Citizen. (Federal News Network)
  • Federal law enforcement officers may see changes to their retirement savings over split annuities. Currently, basic annuity and annuity supplements are calculated the same way for federal officers. That lets former spouses of divorced officers receive a portion of their annuity supplement. Two senators on the Homeland Security and Governmental Affairs Committee are looking to clarify the rules. James Lankford (R-Okla.) and Kyrsten Sinema (D-Ariz.) have reintroduced a bill that would require a court order or similar process for former spouses to actually receive those payments.
  • Lawmakers are pressing the Department of Homeland Security’s inspector general on reports that he suppressed findings of domestic abuse and sexual harassment by DHS employees. House lawmakers are asking Inspector General Joseph Cuffari’s office for documents associated with an unpublished report on sexual harassment at DHS law enforcement components. House Oversight Committee Chairwoman Carolyn Maloney (D-N.Y.) and Homeland Security Committee Chairman Bennie Thompson (D-Miss.) said they’re troubled by reports that Cuffari’s office sought to remove findings from the draft report, as well as another already published report on domestic abuse. The lawmakers want answers from the IG by May 24.
  • The Environmental Protection Agency is delaying plans to close a lab in Houston after shuttering other facilities. The EPA tells employees it’s pushing back plans to relocate staff in Houston to another facility about 400 miles away in Ada, Oklahoma. The agency, at first, planned to relocate employees no later than 2023, but is now pushing those plans back to 2027. The relocation would affect about 30 EPA lab employees and 11 contractor employees. The EPA has closed other regional labs over the past few years, including in Richmond, California, Grosse Ile, Michigan and Las Vegas. (Federal News Network)
  • The Defense Department may be rethinking how it classifies space programs. Since the inception of the Space Force, some military officials have complained that too much classification has been an issue for connecting with new businesses. DoD’s top space official said the Pentagon is now considering changing how it classifies space programs and working with Congress if needed.
  • The Navy said its fleet is too small for its mission, but is the mission reasonable? The chief of naval operations said the service’s fleet is too small to handle two major conflicts at once. The Navy currently has about 300 ships. Adm. Michael Gilday proposed expanding the Navy to 500 ships. However, many lawmakers question whether the United States needs to be prepared for a two-front war given the current landscape. The Defense Department is currently revamping its National Defense Strategy, which could lower the bar for what the military needs to be responsible for.
  • At least a handful of military academy students are facing serious financial ramifications over their refusal to get vaccinated against COVID-19. The Air Force Academy said it has four cadets who refused vaccines and, as a result, would be blocked from graduation and from becoming commissioned officers. But a spokesman said their status could change between now and graduation day as they “weigh the consequences.” Air Force officials said the cadets could also be required to repay hundreds of dollars in tuition costs. The final decision would be up to the Secretary of the Air Force. Officials at the Army and Navy service academies said they have no graduating students who refused COVID vaccines. (Federal News Network)
  • Six weeks after pausing the $50 billion small business contract known as Polaris, the General Services Administration is out with suggested changes. Small businesses have a chance to weigh in on the changes GSA wants to make to the Polaris solicitation. GSA released draft updated language Friday, as part of its decision to rethink the requirements for mentor-protégé and joint venture bidders. GSA, which worked with the Small Business Administration on the draft changes, is asking protégés to provide a minimum of one primary relevant experience project or emerging technology relevant experience project. The mentor can provide no more than three primary relevant experience projects. The original RFP didn’t require the protégé to provide any examples of relevant experience. Vendors have until May 23 to provide comments. (Federal News Network)
  • Agencies spent $420 billion in fiscal 2021 on common goods and services that fall under the governmentwide category management initiative. That was $20 billion more than in 2020 despite fewer awards and contracts. The Government Accountability Office said category management is one of several acquisition areas where agencies could make more progress to reduce costs and duplication. GAO said in its bi-annual report on duplication across government that the Office of Management and Budget needs to address agencies’ data management challenges and establish additional performance metrics to help achieve more cost savings, as well as potentially eliminate duplicative contracts.
  • Agencies and industry are charting a path forward to secure open source software. The Linux Foundation and the Open Source Software Security Foundation said the plan is backed by about $150 million in industry funding over the next two years. Goals include increasing the security of open source production, improving vulnerability discovery and remediation, and speeding up the time it takes to patch systems. The White House, the Cybersecurity and Infrastructure Security Agency, and other agencies helped develop the plan.
  • A new appointee is heading to the Merit Systems Protection Board. Tiffany Lightbourn will join MSPB as director of policy and evaluation at the board’s headquarters. In the role, Lightbourn will lead non-partisan studies that evaluate the board’s policies and programs guiding federal workforce cases. The Office of Policy and Evaluation at MSPB regularly shares its findings and recommendations with the White House, Congress and other stakeholders. Before joining MSPB, Lightbourn served as director of human resources shared services at the IRS.
What good is all of the criminal justice data put out by the Justice Department if no one uses it?
https://federalnewsnetwork.com/big-data/2022/03/what-good-is-all-of-the-criminal-justice-data-put-out-by-the-justice-department-if-no-one-uses-it/
Fri, 25 Mar 2022 16:10:51 +0000

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

Each year, the Justice Department churns out lots of data about criminal justice. Recently it launched an initiative to prod state and local governments to use the data more effectively to make policy and budget decisions. To find out how it works, the Federal Drive with Tom Temin spoke with the acting deputy director for policy at the Bureau of Justice Assistance, Ruby Qazilbash.

Interview transcript:

Tom Temin: Ms. Qazilbash, good to have you on.

Ruby Qazilbash: Thank you so much.

Tom Temin: Let’s begin at the beginning. There are a lot of bureaus at the Justice Department, and for those of us that are uninitiated, help sort us out. The Bureau of Justice Assistance, where does that fit into the whole beehive of the Justice Department?

Ruby Qazilbash: I like to say that the Bureau of Justice Assistance, or BJA, is on that branch of the Department of Justice tree that reaches out to states and localities and tribes, and our mission is to support and strengthen criminal justice systems within states and localities and tribes. The branch of our tree is named the Office of Justice Programs, and on that branch, along with the Bureau of Justice Assistance, are the statistical arm of the department, the Bureau of Justice Statistics; the research arm, the National Institute of Justice; and others, too, that focus on victims’ issues as well as juvenile justice issues. Our primary focus is on the adult criminal justice system.

Tom Temin: Right. And the criminal justice system itself, then, comprises both the police and law enforcement end as well as courts and sentencing and that whole process end, correct?

Ruby Qazilbash: You got it.

Tom Temin: All right, the initiative that you have going, give us a sense of the type of data. First of all, what’s even available to states and localities and tribes that you generate, and how does it get generated?

Ruby Qazilbash: Well, there are data gaps for sure. While every state is different, though, we consistently find that agencies collect data that would be useful for policymakers and the public, but they don’t necessarily release those data, or they release them infrequently. Or they release them in ways that don’t lend themselves to drive policymaking in a data-driven way. And the Justice Counts State Data Scan, something that we did across all 50 states, looking at corrections data, for instance, to see what was out there, clearly illustrates a trend. So for example, 38 states report their prison populations at least monthly, but less than half of states report their post-release supervision population at the same rate. So while state participation is critical to the success of initiatives like Justice Counts, we think the federal government is uniquely situated to really catalyze the kind of coordinated, substantial effort that’s necessary to change the face of criminal justice data.

So in Justice Counts, you have the federal government that’s providing the infrastructure, the instruction and some in-state support to make these metrics real, potentially, for every criminal justice agency that you just talked about. So whether that is law enforcement, or prosecutors or defense, jails, the court system, prisons, community supervision, we think that the federal government has a key role here in helping state and local agencies on these issues related to data. And Justice Counts is an effort to do just that. So again, most agencies already collect lots of data and have systems and policies and protocols for maintaining it and analyzing it. The resources that Justice Counts will provide will be the metrics, a technology platform, technical assistance and more. What we’re hoping to do with that is to help states and localities do more with what they already collect. And then that’ll provide policymakers with some key and timely information to support data-driven policymaking within the criminal justice arena.

Tom Temin: What about those states and localities and tribes that don’t want to make it public necessarily, or figure they just gather it for their own purposes? But I guess politics comes into this at different levels in different locales in different ways?

Ruby Qazilbash: It sure does. And we are hoping, just as with any database, that the fuller and more complete the data available for criminal justice policymakers to get more of a bird’s-eye view and to fill in as many puzzle pieces as possible, the more it’s going to help them drive data-driven policymaking. This is an opt-in approach. The Bureau of Justice Assistance is really in the business of making grant funding, tools and resources available for states and localities and tribes, and that’s what Justice Counts is another example of. So we are bringing together the Bureau of Justice Assistance, in partnership with an organization called the Council of State Governments Justice Center, and then 21 different partner organizations representing associations and different groups that represent all those different parts of the criminal justice system that I talked about, that are working together to achieve consensus to identify, at baseline, what are those kind of “duh” metrics, so to speak, that we should already be collecting, and that would be easy and feasible for us to collect and also to share. And so we’ll be putting the metrics out there and making those publicly available, announcing those later this spring, providing a technology platform for those that want to opt in and share publicly with their peer organizations within their jurisdiction, to bubble up to take a look at the state, or potentially to look and compare across jurisdictions, for instance with a neighboring jurisdiction. So this is not a strong-arm approach. This is a resource that we’re providing to states and localities and tribes to opt into, in hopes to make information more readily available to people that need it, but also the public.

Tom Temin: We’re speaking with Ruby Qazilbash. She’s the acting deputy director for policy at the Bureau of Justice Assistance at the Justice Department. And do you see this, potentially, this data rolling up to fill in gaps in what the federal government collects? At the federal level, say, the FBI has crime statistics that are national in scope. Could these combined data sets from the locales enhance what’s available nationally?

Ruby Qazilbash: We have a lot of rigorous data sets already, obviously, that the FBI collects through NIBRS, the National Incident-Based Reporting System, that are intended and built to be aggregated and to be nationally representative. We also have the Bureau of Justice Statistics that engages in statistical collections for various statutory and other purposes. This is not meant to replace any of them, but like I said, is rather a resource for states and localities and tribes to be able to identify core sets of metrics that their policymakers can use to drive decisions. So there’s lots of things that they need to know, whether that’s about populations as they move through the system or operational aspects. But we don’t have criminal justice agencies that are collecting the same metrics, defined similarly, and aggregating them and looking at trends over time. This gives them the opportunity to do that.

Tom Temin: And do you feel there might be localities out there that would like to get into the data business, knowing that it can result in better decision making and better government, but maybe just don’t have the experience or the technology to do so in the first place?

Ruby Qazilbash: I sure hope so. Yeah.
What these metrics do is really get everyone, so policymakers, state and local criminal justice agency leaders, advocates on the same page about what to look at, what to share, what to consider when making policy decisions. And the metrics are also the basis of agency-focused tools, the public facing platform that will develop and other resources that will develop through Justice Counts. To create them, we really ask ourselves two questions. One, does the metric convey useful information to policymakers? And two, is it feasible? So for each metric of feasibility kind of rating, and we're only including those that exceed that threshold. So is the metric really feasible for most agencies to be able to collect and share? Basically, do most agencies have the data necessary to produce it? And there's a really wide range of potential metrics that could be helpful for research and analysis. But if agencies don't have the data, or can't get it, it won't work. And we found that no two agencies are identical. So being flexible and kind of calibrating our approach towards feasibility was really important to us. And we also realize that we've got to walk before we can run. So we'll be starting with a handful of metrics, tier one metrics, so to speak, and then gradually expanding down the road as agencies really become more accustomed to using the metrics and the associated data platform that we'll be standing up.nn<strong>Tom Temin: <\/strong>Now is there money involved here? That is a situation of a local government wants to get onto that technology platform wants to access the metrics? It's at their expense? If there is an expense, is there grants available? How does it work from that standpoint?nn<strong>Ruby Qazilbash: <\/strong>Right now, later this spring, we will be releasing that tier one or the handful of kind of indispensable metrics that we really hope every justice agency will be able to commit to, it will be possible for them to collect and share at least locally. 
We have technical assistance that will be available for anyone that wants to adopt the metrics and use the platform and how to use it. Later this year, we hope to release a solicitation that will make funding available to states to help with capacity building at the state level. And that can also drill down and assist local jurisdictions or agencies and local jurisdictions participate as well.nn<strong>Tom Temin: <\/strong>And just to be clear, besides giving them the means to collect the data and store it in an organized way, there's also information tools, whatever available to help localities use the data and create analysis on how they can better do budgeting or better change other policies throughout their criminal justice systems?nn<strong>Ruby Qazilbash: <\/strong>We are still building out the full data platform. And in the future, having some analytical tools built in would be a great thing to include. But I can't say right now that's included in the full build out.nn<strong>Tom Temin: <\/strong>All right. Is there a timeline to this? Or is there a deadline to it? Or is this going to be a kind of perpetual program as different state and local and tribal governments sign on?nn<strong>Ruby Qazilbash: <\/strong>Yeah, well, we'll be announcing the launching the tier one metrics later on this spring. Jurisdictions can look for a solicitation. Other federal agencies call them different things request for proposals or notice of funding availability. At the Department of Justice, we call them solicitations. So states can look for a solicitation that will be released later this funding season, this spring, for funding to support. But if agencies are interested in participating as soon as they're released, technical assistance will be available to them to get support to do that.nn<strong>Tom Temin: <\/strong>Ruby Qazilbash is the acting deputy director for policy at the Bureau of Justice Assistance at the Justice Department. 
Thanks so much for joining me.nn<strong>Ruby Qazilbash: <\/strong>Thank you.<\/blockquote>"}};

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

Each year, the Justice Department churns out lots of data about criminal justice. Recently it launched an initiative to prod state and local governments to use the data more effectively to make policy and budget decisions. To find out how it works, the Federal Drive with Tom Temin spoke with the acting deputy director for policy at the Bureau of Justice Assistance, Ruby Qazilbash.

Interview transcript:

Tom Temin: Ms. Qazilbash, good to have you on.

Ruby Qazilbash: Thank you so much.

Tom Temin: Let’s begin at the beginning. There’s a lot of bureaus at the Justice Department, and just for those of us that are uninitiated, help sort us out. The Bureau of Justice Assistance, where does that fit into the whole beehive of the Justice Department?

Ruby Qazilbash: I like to say that the Bureau of Justice Assistance or BJA is on that branch of the Department of Justice tree that reaches out to states and localities and tribes and our mission is to support and strengthen criminal justice systems within states and localities and tribes. The branch of our tree is named the Office of Justice Programs and on that branch, along with the Bureau of Justice Assistance are the statistical arm of the department, the Bureau of Justice Statistics, the research arm, the National Institute of Justice, and others, too, that focus on victims issues as well as juvenile justice issues. Our primary focus is on the adult criminal justice system.

Tom Temin: Right. And the criminal justice system itself, then, comprises both the police and law enforcement end as well as courts and sentencing and that whole process end, correct?

Ruby Qazilbash: You got it.

Tom Temin: All right, the initiative that you have going, give us a sense of the type of data. First of all, what’s even available to states and localities and tribes that you generate, and how does it get generated?

Ruby Qazilbash: Well, there are data gaps for sure. While every state is different, though, we consistently find that agencies collect data that would be useful for policymakers and the public, but they don’t necessarily release those data, or they release them infrequently. Or they release them in ways that don’t lend themselves to driving policymaking in a data driven way. And the Justice Counts State Data Scan, something that we did across all 50 states, looking at corrections data, for instance, to see what was out there, clearly illustrates a trend. So for example, 38 states report their prison populations at least monthly, but less than half of states report their post release supervision population at the same rate. So while state participation is critical to the success of initiatives like Justice Counts, we think the federal government is uniquely situated to really catalyze the kind of coordinated, substantial effort that’s necessary to change the face of criminal justice data.

So in Justice Counts, you have the federal government that’s providing the infrastructure, the instruction, and some in-state support to make these metrics real, potentially, for every criminal justice agency that you just talked about, whether that is law enforcement, or prosecutors or defense, jails, the court system, prisons, community supervision. And we think that the federal government has a key role here in helping state and local agencies on these issues related to data. And Justice Counts is an effort to do just that. So again, most agencies already collect lots of data and have systems and policies and protocols for maintaining it and analyzing it. The resources that Justice Counts will provide will be the metrics, a technology platform, technical assistance and more. What we’re hoping to do with that is to help states and localities do more with what they already collect. And then that’ll provide policymakers with some key and timely information to support data driven policymaking within the criminal justice arena.
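The 50-state data scan Qazilbash describes boils down to tallying, per metric, what share of states publish at a given frequency. A minimal sketch of that kind of tally, with invented state records standing in for the real scan data:

```python
from collections import defaultdict

# Toy version of the Justice Counts state data scan described above.
# The state records are invented for illustration; only the shape of
# the tally mirrors the scan (e.g., "38 states report prison
# populations at least monthly").
scan = [
    ("prison_population", "AL", "monthly"),
    ("prison_population", "AK", "monthly"),
    ("prison_population", "AZ", "annually"),
    ("post_release_supervision", "AL", "annually"),
    ("post_release_supervision", "AK", "monthly"),
    ("post_release_supervision", "AZ", "never"),
]

def monthly_share(records):
    """Fraction of surveyed states reporting each metric at least monthly."""
    totals = defaultdict(int)
    monthly = defaultdict(int)
    for metric, _state, freq in records:
        totals[metric] += 1
        if freq == "monthly":
            monthly[metric] += 1
    return {m: monthly[m] / totals[m] for m in totals}

shares = monthly_share(scan)
```

With the toy records above, the prison population metric is reported monthly by two of three states while post-release supervision is reported monthly by only one, mirroring the gap the scan found.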

Tom Temin: What about those states and localities and tribes that don’t want to make it public necessarily, or figure they just gather it for their own purposes, but I guess politics comes into this at different levels in different locales in different ways?

Ruby Qazilbash: It sure does. And we are hoping, just as with any database, that the fuller and more complete the data that are available for criminal justice policymakers to get more of a bird’s eye view and to fill in as many puzzle pieces as possible, the more it is going to help them drive data driven policymaking. This is an opt in approach. The Bureau of Justice Assistance is really in the business of making grant funding, tools and resources available for states and localities and tribes. And that’s what Justice Counts is another example of. So we are bringing together the Bureau of Justice Assistance, in partnership with an organization called the Council of State Governments Justice Center, and then 21 different partner organizations, associations and different groups that represent all those different parts of the criminal justice system that I talked about, that are working together to achieve consensus and identify that baseline: What are those kind of duh metrics, so to speak, that we should already be collecting, and that would be feasible for us to collect and also to share? And so we’ll be putting the metrics out there and making those publicly available, announcing those later this spring, providing a technology platform for those that want to opt in and share publicly with their peer organizations within their jurisdiction, to bubble up to take a look at the state, or potentially to look and compare across jurisdictions, for instance with a neighboring jurisdiction. So this is not a strong arm approach. This is a resource that we’re providing to states and localities and tribes to opt into, in hopes of making information more readily available to people that need it, but also the public.

Tom Temin: We’re speaking with Ruby Qazilbash. She’s the acting deputy director for policy at the Bureau of Justice Assistance at the Justice Department. And do you see this data potentially also rolling up to fill in gaps in what the federal government collects? At the federal level, say, the FBI has crime statistics that are national in scope. Could these combined data sets from the locales enhance what’s available nationally?

Ruby Qazilbash: We have a lot of rigorous data sets already, obviously, that the FBI collects through NIBRS, the National Incident-Based Reporting System, that are intended and built to be aggregated and be nationally representative. We also have the Bureau of Justice Statistics that engages in statistical collections for various statutory and other purposes. This is not meant to replace any of them, but like I said, is rather a resource for states and localities and tribes to be able to identify core sets of metrics that their policymakers can use to drive decisions. So there’s lots of things that they need to know, whether that’s about populations as they move through the system or operational aspects. But we don’t have criminal justice agencies that are collecting the same metrics, defined similarly, and aggregating them and looking at trends over time. This gives them the opportunity to do that.

Tom Temin: And do you feel there might be localities out there that would like to get into the data business, knowing that it can result in better decision making and better government, but maybe just don’t have the experience or the technology to do so in the first place?

Ruby Qazilbash: I sure hope so. Yeah. What these metrics do is really get everyone, so policymakers, state and local criminal justice agency leaders, advocates, on the same page about what to look at, what to share, what to consider when making policy decisions. And the metrics are also the basis of agency-focused tools, the public facing platform that we’ll develop and other resources that we’ll develop through Justice Counts. To create them, we really ask ourselves two questions. One, does the metric convey useful information to policymakers? And two, is it feasible? So each metric gets a feasibility rating of sorts, and we’re only including those that exceed that threshold. So is the metric really feasible for most agencies to be able to collect and share? Basically, do most agencies have the data necessary to produce it? And there’s a really wide range of potential metrics that could be helpful for research and analysis. But if agencies don’t have the data, or can’t get it, it won’t work. And we found that no two agencies are identical. So being flexible and kind of calibrating our approach towards feasibility was really important to us. And we also realize that we’ve got to walk before we can run. So we’ll be starting with a handful of metrics, tier one metrics, so to speak, and then gradually expanding down the road as agencies really become more accustomed to using the metrics and the associated data platform that we’ll be standing up.
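The two-question screen she describes (is the metric useful to policymakers, and does its feasibility rating clear a threshold?) can be sketched as a simple filter. The metric names, ratings and the 0.7 cutoff below are hypothetical, not Justice Counts' actual values.

```python
FEASIBILITY_THRESHOLD = 0.7  # hypothetical cutoff, not the program's real one

# Invented candidate metrics with the two screening attributes described:
# usefulness to policymakers, and a feasibility rating.
candidate_metrics = [
    {"name": "prison_population",        "useful": True,  "feasibility": 0.9},
    {"name": "post_release_supervision", "useful": True,  "feasibility": 0.4},
    {"name": "staff_parking_usage",      "useful": False, "feasibility": 0.95},
]

def tier_one(metrics, threshold=FEASIBILITY_THRESHOLD):
    """Keep metrics that pass both screens, highest feasibility first."""
    kept = [m for m in metrics if m["useful"] and m["feasibility"] >= threshold]
    return sorted(kept, key=lambda m: m["feasibility"], reverse=True)
```

Lowering the threshold widens the tier, which matches the "walk before we can run" plan of starting with a small tier one set and expanding later.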

Tom Temin: Now is there money involved here? That is, in a situation where a local government wants to get onto that technology platform and wants to access the metrics, is it at their expense? If there is an expense, are there grants available? How does it work from that standpoint?

Ruby Qazilbash: Right now, later this spring, we will be releasing that tier one, or the handful of kind of indispensable metrics that we really hope every justice agency will be able to commit to; it will be possible for them to collect and share them at least locally. We have technical assistance that will be available for anyone that wants to adopt the metrics and use the platform, and on how to use it. Later this year, we hope to release a solicitation that will make funding available to states to help with capacity building at the state level. And that can also drill down and assist local jurisdictions, or help agencies in local jurisdictions participate as well.

Tom Temin: And just to be clear, besides giving them the means to collect the data and store it in an organized way, are there also information tools or whatever available to help localities use the data and create analysis on how they can better do budgeting, or better change other policies throughout their criminal justice systems?

Ruby Qazilbash: We are still building out the full data platform. And in the future, having some analytical tools built in would be a great thing to include. But I can’t say right now that’s included in the full build out.

Tom Temin: All right. Is there a timeline to this? Or is there a deadline to it? Or is this going to be a kind of perpetual program as different state and local and tribal governments sign on?

Ruby Qazilbash: Yeah, well, we’ll be announcing the launch of the tier one metrics later on this spring. Jurisdictions can look for a solicitation. Other federal agencies call them different things, requests for proposals or notices of funding availability; at the Department of Justice, we call them solicitations. So states can look for a solicitation that will be released later this funding season, this spring, for funding support. But if agencies are interested in participating as soon as the metrics are released, technical assistance will be available to them to get support to do that.

Tom Temin: Ruby Qazilbash is the acting deputy director for policy at the Bureau of Justice Assistance at the Justice Department. Thanks so much for joining me.

Ruby Qazilbash: Thank you.

A lot of what USAID does relies on this geographer https://federalnewsnetwork.com/open-datatransparency/2022/03/a-lot-of-what-usaid-does-relies-on-this-geographer/ Fri, 25 Mar 2022 15:41:06 +0000

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

Geographic information is crucial to agency missions almost everywhere. And it’s especially true for the U.S. Agency for International Development, which operates throughout the world. For how geography and geographic information systems underlie development decisions, the Federal Drive with Tom Temin turned to USAID’s chief geographer, Carrie Stokes.

Interview transcript:

Tom Temin: Ms. Stokes, good to have you on.

Carrie Stokes: Good to be here. Thank you.

Tom Temin: So tell us first how geographic information does underlie what USAID does. I mean, you have locations, countries around the world where development work occurs and is overseen and funded by USAID, but how does specific geographic information figure into all of this?

Carrie Stokes: Well, as you know, USAID works in about 100 countries around the world. And we are focused on improving the lives of people in those countries. We work in many different sectors, so everything from health and education, conflict and stabilization, environment and climate change; the list goes on and on with the many sectors we work in. So the geographic approach to development is an important one that I work to promote with a team of geographers and data analysts in the GeoCenter that I lead. And the idea here is to get a full picture of what’s happening in the countries where we work, not just the sector-based silo approach that is typically the way many organizations operate. So what we do with geographic information is focus on where the development need is concentrated. We look at where we are already working as an agency. And we like to understand how effective our programs are. So all of that means we need geographic information; we need to understand the geography of the places where we’re working, the scale that we’re trying to target, where the communities and the people are that we’re working with.

Tom Temin: And is your geographic information simply pins in a map? Because the way you describe it, it could be just, well, there’s a poor area here, let’s put a pin there. And then you’ve got a nice push pin map of the country.

Carrie Stokes: In a simple form, the short answer would be yes. Pins in a map are something that everybody’s really familiar with these days, because we all use our mobile devices, and it’s easy to see digital maps in this modern day that we all live in. But the work that the geographers and data analysts on my team do is a bit more detailed than just visualizing where projects are based. We do analytics, and we take data sets that are disparate from many different places. So we’ll want to know, of course, where a project may be based, a health project, for example. But we want to overlay that, meaning we want to combine that information with what is the actual status of people’s health in that particular community or that particular district of a country. So we will look at things like mortality rates, we’ll look at what birth rates are like, we’ll look at the distance that people have to travel to get to a health clinic, for example. Do they have access to basic health services? So we want to look at the spatial extent of the geographic influences that factor into people’s daily lives. So yes, there may be a pushpin here or there, but we’re also trying to get the context in which people live. So bringing in the different kinds of data sets gives us a fuller picture of what’s going on.

Tom Temin: Now, in the United States, there is a rich array of geographic information on almost everything conceivable, down to the sewer line level, that’s easily obtainable. What about some of the countries that are less developed, maybe not quite as open a society, perhaps, as the United States? What are your sources of information, so that they’re willing to say, yes, this is the death rate or the birth rate, or whatever it is you want to overlay? How do you get that information?

Carrie Stokes: Well, we’re sleuths. We’re geographers; we’re good at looking for information that we know will be representative of people and a place. We look at demographic data, so the kind of data that’s collected in a census, for example. Some countries conduct their own in digital form, and they publish that information, and we will use that. If we can’t get it directly from the country itself, there are other sources of household survey information that we really value. The World Bank publishes some of these. USAID, for whom I work, has been publishing an incredible demographic and health survey data set for almost 30 years or more. And these are very valuable pieces of information to combine, information about people’s livelihoods, their health status. And then we also look at the biophysical information on the ground.

So we look using satellite imagery sometimes. We will look at high resolution imagery and zoom into a particular area of interest. That helps us better understand what the actual physical geography is like. So when we get to combine physical geography with human geography, as we say, it really allows us to ask deeper questions about what’s driving human behavior, what may be some of the factors affecting people’s decisions about where they get their food if there’s a food insecurity situation going on. So we’re very creative about data. But we’ve also tapped into the open mapping data movement, as I will call it. There is a platform called OpenStreetMap, and we use it. It empowers people who have access to the internet to create geospatial data that represents their own local communities. And this data is made available to anyone who uses the platform. So we make use of this as well; we contribute to it, and we use it. But half the challenge for us, when we’re going to analyze a particular area, is in fact getting reliable, trustworthy data that we know is representative and timely.

Tom Temin: We’re speaking with Carrie Stokes. She’s chief geographer at the U.S. Agency for International Development. So it sounds like the importance of geo means that you probably help inform what surveys USAID does in the first place, in areas where it’s able to do surveys.

Carrie Stokes: We do, and we feel it’s important when users of information have an opportunity to influence what information actually gets collected. So the user base of this information is growing and growing. It’s not just me and my team; we have a geospatial community of practice, as we say, all throughout USAID. We have about 165 people. That’s pretty incredible, given that we’re not a mapping agency, we’re a development agency. But the power of geospatial data and technology to help us really visualize what’s going on and where it’s happening is continuing to grow. We’re living in a geospatial revolution, and the kinds of data sets that are actually available today, compared to 10 years ago when we started, well, 11 years ago when we started the GeoCenter, it’s incredible. And every day there are more datasets becoming available in the public domain that we can use.

Tom Temin: And you tap into the government’s own geospatial community, which is governmentwide: military, intelligence and civilian.

Carrie Stokes: Yes, we do. So there is a federal geospatial community. It’s very vibrant, and has representatives, an interagency group, from most departments and civilian agencies. And I’m part of that. I represent my own agency in that Federal Geographic Data Committee, as it’s known. We also have another interagency group known as the U.S. Group on Earth Observations, USGEO for short, because we have an acronym for everything in the federal government, of course. And I represent USAID with that group as well. And Earth observations kind of sounds like a big fancy term, but it’s exactly what it says: it’s observing the Earth from space. We also have in situ measurements, as they’re called. So if you think about buoys in the ocean, collecting information about ocean temperature and ocean currents, being able to combine these incredible multiple sources of different data sets from around the world helps us better understand and sort of monitor the pulse of the planet. And this is especially important for climate change and understanding how our planet and climate are changing.

Tom Temin: Give us an example. Say a USAID decision is made: We want to give a grant to operators in this country to do this piece of infrastructure, say, and I’m just sort of making it up. How do they bring in the geographic information element, and do your people help them make that decision? Give us an example of how this all works from a functional bureaucratic standpoint.

Carrie Stokes: Sure. Well, one of the things that we do, as I mentioned earlier, we have a presence in 70 countries around the world, to include many in Africa. And when we are doing our strategic planning, so a five-year plan out, as we decide what programs are needed on the ground, we do a major figurative landscape assessment and a literal landscape assessment. It’s important for us to know what the trends are, what’s happening, what’s going on on the ground. What we decided to do five or six years ago may no longer be directly relevant today. We may need to pivot. So for example, in East Africa, we worked very closely with our colleagues on the ground as they were planning, making their five-year plan. And looking again at these different data sets helped to illuminate this youth bulge, as we term it. So the demographic of an age group, about 15 to 25, is a pretty critical group of people, because those are the folks who are getting educated, who will be entering the labor force and being active in their own communities, becoming leaders in their own communities. But until we really looked at the data country by country, and sort of zoomed out a little bit at the entire region of East Africa and visualized over time, so we can make time series visualizations and see over a 20-year timespan, how are the demographics changing? Get a sense about where the people are, how they have moved, just in terms of population density?

That information is critical, because we need to know where the people are. Are they hovering around a Lake Victoria, for example? Seeing the population growth there, that’s a key component of East Africa and the local and regional economy. So when we were looking at things like food security, for example, one of the analyses that we conducted in Uganda with the data showed us that the food security of a household is related to the literacy levels of the girls who live in that household. And this, believe it or not, is a geographic issue; we can see where throughout the country this is the case. So finding connections between sectors is sometimes illuminating for our colleagues, because now we can make decisions about, OK, we were investing a certain amount to improve food security in a particular area; do we also need to ensure that we’re investing in improving girls’ education opportunities in the same area? Of course, we all know that educating girls is important. But did we realize that the data was so strong to show that it affects the food security of a household? So these are the kinds of insights that we can gain. Another example, in Rwanda, was really illuminating. We learned through our geographic analysis that the Protestant community was reproducing and growing much faster than the Catholic community was.

Tom Temin: That’s a real revelation.

Carrie Stokes: Well, it is, because in our culture, with our context, our assumptions might have been the opposite. And this is important because the data and analytics can show us trends, especially when we can plot it out geographically, that can surprise us. It doesn’t replace experts and our expertise and experience, but it can really allow us to question our assumptions, to better understand what’s going on, and therefore target our programs effectively in the places where programming has been dismissed.

Tom Temin: In other words, it’s possible to decide to target development where things aren’t happening, if you want them to happen there. Or if you need to serve a population, then that’s where you do the development. But either way, you’re not blindly landing these investments.

Carrie Stokes: Exactly, yes. And in today’s world, we have access to this digital data, digital technologies, high resolution satellite imagery, mobile mapping apps on our phones these days. It’s just an endless source of information that USAID did not have more than 60 years ago, when we got established.

Tom Temin: A good old-fashioned topography is still part of the overlay that you need to know, too. Well, you can’t put a pumping station there, because it’s a 75-degree hill.

Carrie Stokes: You are correct. Topography is part of geography. Geography is really the study of place and people and how they interact. So it’s extremely important. You wouldn’t want, for example, to be digging a well at the top of a mountain; you’re going to have to dig far, far down. But if you’re looking at a flat map, and haven’t realized that there’s some topographic relief, with ups and downs, hills and valleys, you might make a decision without really knowing the full situation before you get to that location.

Tom Temin: And just briefly, how did you get to this point in your career as the chief geographer there?

Carrie Stokes: So I started at USAID, and this is dating myself now, in the year 2000. And I was a climate change specialist when I first came in. And I came in with a background of having worked in Africa as a Peace Corps volunteer after college. But I also brought with me a background in geographic information systems, GIS as we call it, as well as the climate change science. And I had worked for a period of time for a private tech company that designed its own GIS software and hardware. So when I came into government, I was a little surprised that we weren’t using, and this was 22 years ago, the technology that I saw was really unfolding to become more readily available on the internet. So I continued in my work on climate change, then realized we needed to better understand how to track our progress for our investments, not just in our climate programs, but in all the programs that we invest in around the world. And it really made sense to me to start putting this information together, collecting the location.

We didn’t have, at the time, a systematic way of collecting the location of our USAID activities. Today we do; we have policies in place, and we are getting the IT infrastructure in place to be able to collect this and be able to analyze it. But that process was, for me, evidence that it was a gap. And today we’re filling that gap. So I initiated some efforts to start mapping the work that we do, and showing colleagues the power of that, and got involved in the interagency geospatial community as well.
Started a program with NASA, our nation's space agency, where by USAID, the development agency lead for our U.S. government and our space agency combined forces to create a program called SERVIR. And the SERVIR program is still going strong to this day, 17 years, I believe, at this point. And the idea behind that was to empower our colleagues in developing countries to get access to Earth observation information, mapping technologies, to help their ministries of health ministries of environment, disaster response ministries, agriculture ministries, and allow them to benefit from some of the same technologies and that we have in this data rich country of the U.S.nnSo that interaction with NASA, combining the climate change work and the geospatial technology work really led me to decide, OK, now we need to focus on empowering our own staff in USAID. So the GeoCenter that I established about 11 years ago was really like a small, SERVIR hub, as we would say. It was just embedded in the walls of our own headquarters office in Washington, D.C., so that our own staff could get access to the same kinds of technologies and data sets that we were promoting, with our colleagues in the many countries USAID works.nn<strong>Tom Temin:<\/strong> Sure. Sounds like you really liked this work?nn<strong>Carrie Stokes:\u00a0<\/strong>I do. It's great. Geography is the study of everything everywhere, so it's never dull. And my team and I get to work across many sectors. And we get to work across many geographies. And what we learn in sort of the big picture analytics helps us really zoom into local scales when we're working with local communities and our colleagues in country to take lessons learned from one part of the world that we've seen might be relevant to another part of the world. So it's a fantastic opportunity, and I love what I do.nn<strong>Tom Temin:<\/strong> Carrie Stokes is chief geographer at the U.S. Agency for International Development. 
Thanks so much for joining me.nn<strong>Carrie Stokes: <\/strong>Thank you for having me.<\/blockquote>"}};

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

Geographic information is crucial to agency missions almost everywhere, and that is especially true for the U.S. Agency for International Development, which operates throughout the world. To learn how geography and geographic information systems underlie development decisions, the Federal Drive with Tom Temin turned to USAID’s chief geographer, Carrie Stokes.

Interview transcript:

Tom Temin: Ms. Stokes, good to have you on.

Carrie Stokes: Good to be here. Thank you.

Tom Temin: So tell us first how geographic information does underlie what USAID does. I mean, you have locations, countries around the world where development work occurs and is overseen and funded by USAID but how does specific geographic information figure into all of this?

Carrie Stokes: Well, as you know, USAID works in about 100 countries around the world. And we are focused on improving the lives of people in those countries. We work in many different sectors so everything from health and education, conflict and stabilization, environment and climate change, the list goes on and on with the many sectors we work in. So the geographic approach to development is an important one that I work to promote with a team of geographers and data analysts in the GeoCenter that I lead. And the idea here is to get a full picture about what’s happening in the countries where we work, not just in a sector-based silo approach that is typically the way many organizations operate. So what we do with geographic information is focused on where the development need is concentrated. We look at where we are already working as an agency. And we like to understand how effective our programs are. So all of that means we need geographic information, we need to understand the geography of the places where we’re working in the scale that we’re trying to target, where the communities and the people are that we’re working with.

Tom Temin: And is your geographic information simply pins in a map? Because the way you describe it, it could be just, well, there’s a poor area here, let’s put a pin there. And then you’ve got a nice push pin map of the country.

Carrie Stokes: In a simple form, the short answer would be yes, pins in a map are something that everybody’s really familiar with these days because we all use our mobile devices. And it’s easily accessible to see digital maps in this modern day that we all live in. But the work that the geographers and data analysts on my team do is a bit more detailed than just visualizing where projects are based. We do analytics, and we take data sets that are disparate from many different places. So we’ll want to know, of course, where a project may be based, a health project, for example. But we want to overlay that, meaning we want to combine that information with what is the actual status of people’s health in that particular community or that particular district of a country. So we will look at things like mortality rates, we’ll look at what birth rates are like, we’ll look at the distance that people have to travel to get to a health clinic, for example, do they have access to basic health services? So we want to look at the spatial extent of the geographic influences that factor into people’s daily lives. So yes, there may be a pushpin here or there, but we’re also trying to get the context in which people live. So bringing in the different kinds of data sets gives us a fuller picture of what’s going on.
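One concrete piece of the overlay Stokes describes, how far people must travel to reach a clinic, reduces to a great-circle distance calculation. The sketch below is illustrative only: the function names and the coordinates (loosely placed near Lake Victoria) are invented for this example, not USAID data.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_clinic_km(community, clinics):
    """Distance from one community to its closest clinic."""
    lat, lon = community
    return min(haversine_km(lat, lon, clat, clon) for clat, clon in clinics)

# Hypothetical coordinates, roughly in the Lake Victoria region.
clinics = [(0.31, 32.58), (-0.09, 34.77), (0.05, 32.45)]
community = (0.35, 32.60)
print(round(nearest_clinic_km(community, clinics), 1))  # about 5 km to the nearest clinic
```

In a real analysis this per-community number would be joined with survey indicators (mortality, birth rates) to form the layered picture she describes.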

Tom Temin: Now, in the United States, there is a rich array of geographic information on almost everything conceivable down to the sewer line levels, that’s easily obtainable. What about some of the countries that are less developed, maybe not quite as open to society, perhaps as the United States? What are your sources of information, so that they’re willing to say yes, this is the death rate or the birth rate, or whatever it is, you want to overlay? How do you get that information?

Carrie Stokes: Well, we’re sleuths. We’re geographers, we’re good at looking for information that we know will be representative of people and a place. We look at demographic data, so the kind of data that’s collected in a census, for example. Some countries conduct their own in digital form, and they publish that information, and we will use that. If we can’t get it directly from the country itself, there are other sources of household survey information that we really value. The World Bank publishes some of these. USAID, for whom I work, has been publishing an incredible demographic and health survey data set for 30 years or more. And these are very valuable pieces of information, combining information about people’s livelihoods and their health status. And then we also look at the biophysical information on the ground.

So we look using satellite imagery sometimes. We will look at high resolution imagery and zoom into a particular area of interest. That helps us better understand what the actual physical geography is like. So when we get to combine physical geography with human geography, as we say, it really allows us to ask deeper questions about what’s driving human behavior, what may be some of the factors affecting people’s decisions about where they get their food, if there’s a food insecurity situation going on. So we’re very creative about data. But we’ve also tapped into the open mapping data movement, as I will call it. There is a platform called OpenStreetMap and we use it. It empowers people who have access to the internet to create geospatial data that represents their own local communities. And this data is made available to anyone who uses the platform. So we make use of this as well, we contribute to it, and we use it. But half the challenge for us when we’re going to analyze a particular area is in fact getting reliable, trustworthy data that we know is representative and timely.
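To make the OpenStreetMap idea concrete: community-mapped features such as clinics can be retrieved from OSM with an Overpass QL query. The snippet below only assembles a query string for a made-up bounding box; actually running it would mean POSTing the string to a public Overpass API endpoint, which this sketch deliberately does not do.

```python
def clinics_query(south, west, north, east):
    """Build an Overpass QL query for OSM health facilities in a bounding box.

    The bounding box order (south, west, north, east) follows Overpass QL
    convention; the tag filter matches nodes tagged amenity=clinic or
    amenity=hospital.
    """
    bbox = f"{south},{west},{north},{east}"
    return (
        "[out:json][timeout:25];"
        f'node["amenity"~"clinic|hospital"]({bbox});'
        "out body;"
    )

# Hypothetical bounding box in degrees.
q = clinics_query(-0.5, 32.0, 0.5, 33.0)
print(q)
```

The returned JSON would carry node coordinates and tags, which can then be overlaid with survey data as described above.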

Tom Temin: We’re speaking with Carrie Stokes, she’s chief geographer at the U.S. Agency for International Development. So it sounds like the importance of geo means that you probably help inform what surveys USAID does in the first place, in areas where it’s able to do surveys.

Carrie Stokes: We do and we feel it’s important when users of information have an opportunity to influence what information actually gets collected. So the user base of this information is growing and growing. It’s not just me and my team, we have a geospatial Community of Practice as we say, all throughout USAID. We have about 165 people, that’s pretty incredible given that we’re not a mapping agency, we’re a development agency. But the power of geospatial data and technology to help us really visualize what’s going on and where it’s happening, is continuing to grow. We’re living in a geospatial revolution and the kinds of data sets that are actually available today, compared to 10 years ago, when we started, well 11 years ago we started the GeoCenter, it’s incredible. And every day there are more datasets becoming available in the public domain that we can use.

Tom Temin: And you tap into the government’s own geospatial community, which is governmentwide: military, intelligence and civilian.

Carrie Stokes: Yes, we do. So there is a federal geospatial community. It’s very vibrant, and has representatives, an interagency group from most departments and civilian agencies. And I’m part of that. I represent my own agency in that federal geographic data committee, as it’s known. We also have another interagency group known as the U.S. Group on Earth Observations, USGEO for short, because we have an acronym for everything in the federal government, of course. And I represent USAID with that group as well. And Earth Observations kind of sounds like a big fancy term, but it’s exactly what it says, it’s observing the Earth from space. We also have in situ measurements, as they’re called. So if you think about buoys in the ocean, collecting information about ocean temperature, ocean currents, being able to combine these incredible multiple sources of different data sets from around the world helps us better understand and sort of monitor the pulse of the planet. And this is especially important for climate change and understanding how our planet and climate are changing.

Tom Temin: Give us an example of, say, a USAID, suppose a decision is made. We want to give a grant to operators in this country to do this piece of infrastructure, say, and I’m just sort of making it up. How do they bring in the geographic information element, and do your people help them make that decision? Give us an example of how this all works from a functional bureaucratic standpoint.

Carrie Stokes: Sure. Well, one of the things that we do, as I mentioned earlier, we have a presence in 70 countries around the world, to include many in Africa. And when we are doing our strategic planning, so a five-year plan out, as we decide what programs are needed on the ground, we do a major figurative landscape assessment and literal landscape assessment. It’s important for us to know what the trends are, what’s happening, what’s going on on the ground. What we decided to do five or six years ago may no longer be directly relevant today. We may need to pivot. So for example, in East Africa, we worked very closely with our colleagues on the ground as they were planning, as they were making their five-year plan. And looking again at these different data sets helped to illuminate this youth bulge, as we term it. So the demographic of an age group, about 15 to 25, is a pretty critical group of people, because those are the folks who are getting educated, who will be entering the labor force and being active in their own communities, becoming leaders in their own communities. But until we really looked at the data country by country and sort of zoomed out a little bit at the entire region of East Africa and visualized over time, so we can make time series visualizations and see over a 20-year timespan, how are the demographics changing? Get a sense about where are the people? How have they moved just in terms of population density?
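In data terms, the "youth bulge" Stokes mentions is just the share of a population in roughly the 15-to-25 age band, tracked across census years. A toy calculation, with all numbers invented for illustration:

```python
def youth_share(age_counts, lo=15, hi=25):
    """Fraction of the population whose age falls in [lo, hi]."""
    total = sum(age_counts.values())
    youth = sum(n for age, n in age_counts.items() if lo <= age <= hi)
    return youth / total

# Invented census counts, keyed by a representative age (people in thousands).
census_2000 = {10: 900, 20: 700, 40: 800, 60: 600}
census_2020 = {10: 1000, 20: 1400, 40: 900, 60: 700}

print(round(youth_share(census_2000), 2))  # 0.23
print(round(youth_share(census_2020), 2))  # 0.35: a growing share, the "bulge"
```

Run per country, or per district, over a 20-year span, the same metric produces the time-series picture she describes.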

That information is critical, because we need to know where the people are. Are they hovering around Lake Victoria, for example? And seeing the population growth there, that’s a key component of East Africa and the local and regional economy. So when we were looking at things like food security, for example, one of the analyses that we conducted in Uganda with the data showed us that the food security of a household is related to the literacy levels of the girls who live in that household. And this, believe it or not, is a geographic issue; we can see where throughout the country this is the case. So finding connections between sectors is sometimes illuminating for our colleagues, because now we can make decisions about, OK, we were investing a certain amount to improve food security in a particular area, do we also need to ensure that we’re investing in improving girls’ education opportunities in the same area? Of course, we all know that educating girls is important. But did we realize that the data was so strong to show that it affects the food security of a household? So these are the kinds of insights that we can gain. Another example, in Rwanda, was really illuminating. We learned through our geographic analysis that the Protestant community was reproducing and growing much faster than the Catholic community was.
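The Uganda finding, that household food security tracks girls' literacy, is the kind of relationship a simple correlation can surface before it is mapped geographically. The district-level figures below are invented for illustration; only the Pearson formula itself is standard.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented district-level rates: girls' literacy vs. a food-security score.
literacy = [0.42, 0.55, 0.61, 0.70, 0.78]
food_security = [0.35, 0.48, 0.52, 0.66, 0.71]

print(round(pearson(literacy, food_security), 2))  # strong positive correlation
```

In a GIS workflow the same coefficient would be computed district by district and plotted on a map to show where the relationship holds.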

Tom Temin: That’s a real revelation.

Carrie Stokes: Well, it is, because in our culture, with our context, our assumptions might have been the opposite. And this is important because the data and analytics can show us information, can show us trends, especially when we can plot it out geographically, that can surprise us. It doesn’t replace experts and our expertise and experience, but it can really allow us to question our assumptions, to better understand what’s going on, and therefore target our programs effectively in the places where programming has been missed.

Tom Temin: In other words, it’s possible to decide to target development where things aren’t happening, if you want them to happen there. Or if you need to serve a population, then that’s where you do do the development. But either way, you’re not blindly landing these investments.

Carrie Stokes: Exactly, yes. And in today’s world, we have access to this digital data, digital technologies, high resolution satellite imagery, mobile mapping apps on our phones these days. It’s just an endless source of information that USAID did not have more than 60 years ago, when we got established.

Tom Temin: And good old-fashioned topography is still part of the overlay that you need to know, too. Well, you can’t put a pumping station there, because it’s a 75-degree hill.

Carrie Stokes: You are correct. Topography is part of geography. Geography is really the study of place and people and how they interact. So it’s extremely important. You wouldn’t want, for example, to be digging a well at the top of a mountain; you’re going to have to dig far, far down. But if you’re looking at a flat map, and haven’t realized that there’s some topographic relief, with ups and downs, hills and valleys, you might make a decision without really knowing the full situation before you get to that location.

Tom Temin: And just briefly, how did you get to this point in your career as the chief geographer there?

Carrie Stokes: So I started at USAID, and this is dating myself now, in the year 2000. And I was a climate change specialist when I first came in. And I came in with a background of having worked in Africa as a Peace Corps volunteer after college. But I also brought with me a background in geographic information systems, GIS as we call it, as well as the climate change science. And I had worked for a period of time for a private tech company that designed its own GIS software and hardware. So when I came into government, I was a little surprised that we weren’t using, even this was 22 years ago, the technology that I saw was really unfolding to become more readily available on the internet. So I continued in my work on climate change, then realized we needed to better understand how to track our progress for our investments, not just in our climate programs, but in all the programs that we invest in around the world. And it really made sense to me to start putting this information together, collecting the location.

We didn’t have, at the time, a systematic way of collecting the location of our USAID activities. Today we do, we have policies in place, and we are getting the IT infrastructure in place to be able to collect this and be able to analyze it. But that process was for me evidence that it was a gap. And today we’re filling that gap. So I initiated some efforts to start mapping the work that we do, and showing colleagues the power of that, and got involved in the interagency geospatial community as well. I started a program with NASA, our nation’s space agency, whereby USAID, the development agency lead for our U.S. government, and our space agency combined forces to create a program called SERVIR. And the SERVIR program is still going strong to this day, 17 years, I believe, at this point. And the idea behind that was to empower our colleagues in developing countries to get access to Earth observation information and mapping technologies, to help their ministries of health, ministries of environment, disaster response ministries, agriculture ministries, and allow them to benefit from some of the same technologies that we have in this data-rich country of the U.S.

So that interaction with NASA, combining the climate change work and the geospatial technology work really led me to decide, OK, now we need to focus on empowering our own staff in USAID. So the GeoCenter that I established about 11 years ago was really like a small, SERVIR hub, as we would say. It was just embedded in the walls of our own headquarters office in Washington, D.C., so that our own staff could get access to the same kinds of technologies and data sets that we were promoting, with our colleagues in the many countries USAID works.

Tom Temin: Sure. Sounds like you really liked this work?

Carrie Stokes: I do. It’s great. Geography is the study of everything everywhere, so it’s never dull. And my team and I get to work across many sectors. And we get to work across many geographies. And what we learn in sort of the big picture analytics helps us really zoom into local scales when we’re working with local communities and our colleagues in country to take lessons learned from one part of the world that we’ve seen might be relevant to another part of the world. So it’s a fantastic opportunity, and I love what I do.

Tom Temin: Carrie Stokes is chief geographer at the U.S. Agency for International Development. Thanks so much for joining me.

Carrie Stokes: Thank you for having me.

Innovation for IRS customer experience hangs on cost effectiveness https://federalnewsnetwork.com/it-modernization/2022/03/innovation-for-irs-customer-experience-hangs-on-cost-effectiveness/ https://federalnewsnetwork.com/it-modernization/2022/03/innovation-for-irs-customer-experience-hangs-on-cost-effectiveness/#respond Wed, 02 Mar 2022 20:04:42 +0000 https://federalnewsnetwork.com/?p=3937975 A Form 1040 tax return may be worth the cost of digitization but what about a more obscure document? That may be cheaper to keep in paper form.

This is a question the IRS considers when it decides to fund innovative efforts. The agency needs to be economical as it tries to improve customer experience, as the December executive order put pressure on agencies to do. Although some might advocate making all 1,439 IRS forms digital, that may be less practical because the lifecycles of those forms vary depending on how taxpayers and businesses use them.

Harrison Smith, co-director of the IRS Enterprise Digitalization and Case Management Office, said the enterprise perspective goes back to mission, stitching together funding and policies to optimize the agency’s approach. Pilot programs can sometimes be as small as 30 days and $25,000, which Smith described as minuscule for the federal government. But if a pilot works, the agency can keep funding it.

“Within those pilots we keep the approach, the solution very, very broad. We try to dictate as little as possible, and simply say things like, ‘We want to be able to scan pieces of paper and create digital images. We don’t care how you do it, we don’t care the device,’” he said during a webinar hosted by GovExec on Feb. 23. He clarified that cybersecurity and taxpayer data protection requirements apply. “Show me that you can do it. Prove that you can, frankly, do the proposal that you submit, and then we’ll talk about, great, how does this fit within our architecture? How does this fit within our policy approach? How does this fit within our business processes?”

From there, the IRS can continue to fund pilots so long as they keep demonstrating process and progress, Smith said. He explained that shifting forms from paper to digital brings its own costs, such as gathering data and storing it.

At the same time, reduced reliance on paper and increased access to machine-readable data are in the IRS’ digitalization strategy. Smith said the agency must help industry partners understand what they want to accomplish.

“We in the government, I think we’ve gotten a lot better. Reverse industry days, the proliferation of industry liaison roles, these types of things. But really making sure that we’re willing to talk with our industry partners and really understand and benefit from both of us, all of really, benefit from that conversation,” he said.

He described learning the hard way why it counts to clearly communicate with industry partners. Early in his career, while leading a procurement with a 60-day bid time, he brushed off questions from curious vendors in an effort to “make sure that everybody gets the same information at the same time. And so I didn’t say anything, like at all to anybody. Nope, no information,” he said. This in turn made it hard for interested partners to submit their bids, as they tended to plan six to 18 months in advance.

“Be transparent and acknowledge challenges, create partnerships, find balance and be kind, because that’s really the essence of what we do,” he said.

Olivia Peterson, AWS Federal Financial Services Leader, agreed that keeping vendors at arm’s length hinders trust. But she described a promising trend of direct engagement at the top levels.

Peterson said the unexpected financial events of the past few years show the need to respond to sudden demands. The IRS knows too well the struggle to innovate in response to new modernization needs, as the agency was hit with compounding pressures due to the pandemic: chronic understaffing, demand for digital services while offices were largely closed or empty, and the distribution of relief funding.

For AWS, that means speed and agility to meet the demand is key. She said they use the concept of press releases to ask themselves why a product or service should matter to end users on a basic level. Then they write an FAQ to reflect deeper on the product’s challenges or business value.

“We iterate on those processes, and actually I look at that across each of our customer segments too: What is the press release that we want to have around customer experience for the IRS? Let’s really ideate on that together and look at the FAQs,” she said. “And then you get into solutioning and making sure that that’s the priority and how it fits.”

Don’t throw out your 8 mm movies, the Library of Congress may want them https://federalnewsnetwork.com/technology-main/2022/02/dont-throw-out-your-8-mm-movies-the-library-of-congress-may-want-them/ https://federalnewsnetwork.com/technology-main/2022/02/dont-throw-out-your-8-mm-movies-the-library-of-congress-may-want-them/#respond Thu, 24 Feb 2022 20:11:19 +0000 https://federalnewsnetwork.com/?p=3928163 Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

Mike Mashon might have one of the coolest jobs in the federal government. He’s head of the Moving Image section at the Library of Congress. Recently, the section posted a digitized version of something never seen before: a home movie of a famous rock concert from 1969. He spoke to the Federal Drive with Tom Temin to tell about his work, and about that grainy movie.

Interview transcript:

Tom Temin: Mr. Mashon, good to have you on.

Mike Mashon: Hey, Tom, very nice to be here, thanks.

Tom Temin: First of all, tell us the bigger picture of what happens in the Moving Image section. I guess they don’t call it movies anymore. An image kind of brings in digital and film and everything that was ever recorded with light, I guess.

Mike Mashon: Right. So the Library of Congress is home to the largest collection of film and video in the world. We have about 1.7 million individual items, physical items in our collection, and an ever-growing collection of digital as well. So we have been collecting film here since the 1890s. And we still get it in through copyright and gift and purchase today. So my section is responsible for acquiring the material, for describing it, doing any conservation rehousing we need, we have good storage for it. And we also establish the preservation priorities for that collection. It’s a big job.

Tom Temin: And when you have motion picture film, from a movie camera, from emulsion, light-sensitive film, do you generally, as a matter of course, digitize that, since film eventually breaks down and dissolves?

Mike Mashon: Yeah, these days, we do digitize the film. But we were always founded on, and we still have, the capability of preserving film on film. We are one of the few places that can still photochemically preserve motion pictures. So we still have the ability to create, for example, 35 mm prints from nitrate film originals.

Tom Temin: And there are still filmmakers working today that use motion picture film, correct, the traditional film?

Mike Mashon: Oh, absolutely. Yes, there’s still a handful out there. We’re a sort of “film forever” kind of people, but we’re not Luddites either. So we definitely have a tremendous number of digital workflows available to us in our facility.

Tom Temin: Yeah, it’s kind of a cultural thing. I mean, it takes getting used to, to look at a full-budget motion picture, I don’t go to the movie theater very much, but to see videotape, that’s something that has been, not to make a pun, “ingrained” on people, to see the grain going by. And I guess someday we’ll get used to not seeing it.

Mike Mashon: You know, it is a little interesting sometimes to go to a theater and you’re watching, typically, a digital cinema package. And those things, they’re very cleaned up, you don’t see the grain in the film anymore. And for people like me, sometimes that can be a little disconcerting.

Tom Temin: And what is the process by which the library decides this film should be preserved in perpetuity, this one maybe not so much?

Mike Mashon: Excellent question. We really do strive to preserve all of it, frankly, particularly in terms of the video that we have in the collection. I do want to make it clear, we have a lot of videotape in our collection as well. We have some ways of doing that in robots, where we can push videotape through robots and do tremendous amounts of digitization, like 20,000 tapes a year, on average. Film is going to be a little different. We have a lot of reels of film in the collection. And you’re right, we do have to make some decisions on what is going to be preserved. We have good preservation storage. So the films that we have are stored in cold and dry conditions. So we’re able to sort of slow down their deterioration until we can get to them. But a lot of times the decisions that we’re making are going to be based on the physical condition of the material. So if the film has really started to deteriorate, we want to make sure that we can scan it as quickly as possible. But we also have a very robust loan program here. There are still a good number of theaters out there that are showing 35 mm film, and we can still make 35 mm prints in addition to making digital cinema packages here. So there are films that we know will be shown in theaters. We also have a lot of our films available online through something called the National Screening Room. And we’ll make sure that those films get sent up to the laboratory for scanning.

Tom Temin: We’re speaking with Mike Mashon, he is the head of the Moving Image section at the Library of Congress. And do you work with the National Archives? Because they have some film, like I think they have the Zapruder film, for example. And it has to be also preserved in that same manner. So how does that interaction happen?

Mike Mashon: Oh, we have a lot of interaction with other federal agencies, the National Archives being a primary one. We’re also very much involved in initiatives with the National Archives in terms of setting standards for digitization. There’s a federal group that works on that. People will frequently ask me, what’s the difference between our collection and the National Archives. The National Archives is responsible for films that were produced by the government. And we have a lot of those in our collection. But we collect even more broadly than that. It’s the reason why you’re going to find a lot of Hollywood films, home movies, educational films in our collection. But we work really closely with the National Archives.

Tom Temin: Now, this recent film that came to light was an amateur-shot film of a famous concert, or a concert at which a famous and kind of unfortunate event happened. Tell us about that, the Altamont concert; The Rolling Stones are in there and some other famous artists of the day. What’s the story behind that film, and who shot it, and how did it come into the Library of Congress?

Mike Mashon: Well, in some ways, the story of finding that film is as interesting as the film, at least from the archival perspective. So one of my colleagues, a technician named John Snelson, is going through a collection of film, processing it, that we received from a man named Rick Prelinger many years ago. Rick collected a lot of films, very well known in the archival field. And it’s a massive collection of well over 150,000 reels of film. So John is just sort of going through the Prelinger collection. Every once in a while, he would come across something, he would just call my attention to it. And he said, I got this film, and the title of it is “Stones in the Park,” and I -

Tom Temin: That could be anything.

Mike Mashon: It really kind of could. Well, for me, what it triggered was, I knew that the Rolling Stones had actually made a film of a concert they did in Hyde Park in July of 1969, not long after the guitarist Brian Jones had died. So that had been filmed and released as “Stones in the Park.” But what John had turned up was an 8 mm film, and 8 mm is a home movie format, so I wasn’t really sure what it was. And I just went ahead and sent it up to the laboratory. I kind of figured, OK, we’re going to want to know what this thing is anyhow. So I put in a digitization order, it goes up to the laboratory, and a few days later, I get a call from the lab. And the guys up there are like, “Mike, you might want to come see this.” So I run upstairs, and here they’re playing the file for me. And it’s Altamont. The Altamont Free Concert was Dec. 6, 1969. Very famous concert, memorialized in the film “Gimme Shelter.” But this was clearly home movie footage shot by somebody right up by the stage. It’s silent. There’s no sound with this.
But you know, you see some acts who aren't in Gimme Shelter. You've got Santana and the Flying Burrito Brothers with Graham Parsons, and Crosby, Stills, Nash and Young are in it, in the home movie, in addition to footage of the Stones and their evening performance.nn<strong>Tom Temin:<\/strong> And that concept became famous, it was in California at a racetrack, and that was where they hired the Hells Angels as bodyguards, and a murder occurred in the audience.nn<strong>Mike Mashon: <\/strong>Yeah, and there's nothing, I mean, you can see the kind of mayhem that's breaking out, as Hells Angels are roaming the stage. But yeah, a concertgoer, Meredith Hunter was killed by a Hells Angel during the Stones set. You don't see anything like that in this home movie. But you asked do we know who the camera person is? We do not. We would very much like to find out who shot this because I will say the film that Rick Prelinger had acquired was from a company called Palmer Films, which was a laboratory in San Francisco that went out of business. And when Palmer Films went out of business, Rick came in, scooped up all their films, added it to his collection, and then they came to us. So Palmer is no longer extant. This film is abandoned at Palmer and so we consider it to be an orphan work. We just simply, we don't know who owns it.nn<strong>Tom Temin:<\/strong> Well, now it's out for the public to see and maybe someone will come forward, "Hey, I was there. I shot that with my DeJur camera or my Bolex."nn<strong>Mike Mashon: <\/strong>I'd love it.nn<strong>Tom Temin:<\/strong> And just while we have you, what is your background? Do you come to this as a film content, artistic person or as a technical preservation format type of guy or how do you come to this job?nn<strong>Mike Mashon: <\/strong>I'm a subject matter expert. I always defer to the technical people on this. My professional background actually started as an immunologist, but I really, really love movies and TV. 
So I went back to school and got a Ph.D. in radio, television and film, and I started at the library 24 years ago as the moving image curator. So I'm very much more on the subject matter side than anything else.nn<strong>Tom Temin:<\/strong> Well, I've got my own home movies in 8 mm and later and Super 8. If you'd like to have them, they're welcome to go into the library. But I have a feeling it probably doesn't quite meet the threshold.nn<strong>Mike Mashon: <\/strong>Oh, no, you would be wrong about that, Tom. We actually very much like home movies around here. We have a lot, a lot of home movies in our collection. And some of them actually date back to the early 1900s. So it's a pretty remarkable collection. And look, not all of them are going to be Altamont. But look, we have home movies from people who took their vacations in Germany in the mid-1930s.nn<strong>Tom Temin:<\/strong> Wow!nn<strong>Mike Mashon: <\/strong>It's really fascinating stuff. But yes, we also have my home movies as well.nn<strong>Tom Temin:<\/strong> Alright, well if you want to see a 3 year old Tom Temin sneezing silently on the beach because I was allergic to everything, in Atlantic City, it's available.nn<strong>Mike Mashon: <\/strong>Fantastic.nn<strong>Tom Temin:<\/strong> Mike Mashon is head of the Moving Image section at the Library of Congress. Thanks so much for joining me.nn<strong>Mike Mashon: <\/strong>I really enjoyed it, Tom. Thanks a lot.<\/blockquote>"}};

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

Mike Mashon might have one of the coolest jobs in the federal government. He’s head of the Moving Image section at the Library of Congress. Recently, the section posted a digitized version of something never seen before: A home movie of a famous rock concert from 1969. He spoke to the Federal Drive with Tom Temin to talk about his work, and about that grainy movie.

Interview transcript:

Tom Temin: Mr. Mashon, good to have you on.

Mike Mashon: Hey, Tom, very nice to be here, thanks.

Tom Temin: First of all, tell us the bigger picture of what happens in the Moving Image section. I guess they don’t call it movies anymore. An image kind of brings in digital and film and everything that was ever recorded with light, I guess.

Mike Mashon: Right. So the Library of Congress is home to the largest collection of film and video in the world. We have about 1.7 million individual items, physical items in our collection, and an ever-growing collection of digital as well. So we have been collecting film here since the 1890s. And we still get it in through copyright and gift and purchase today. So my section is responsible for acquiring the material, describing it, doing any conservation and rehousing we need, and we have good storage for it. And we also establish the preservation priorities for that collection. It’s a big job.

Tom Temin: And when you have motion picture film, from a movie camera from emulsion, light sensitive film, do you generally as a matter of course, digitize that, since film eventually breaks down and dissolves?

Mike Mashon: Yeah, these days, we do digitize the film. But we were always founded and we still have the capability of preserving film on film. We are one of the few places that can still photo chemically preserve motion pictures. So we still have the ability to create, for example, 35 mm prints from nitrate film originals.

Tom Temin: And there are still filmmakers working today that use motion picture film, correct, the traditional film?

Mike Mashon: Oh, absolutely. Yes, there’s still a handful out there. We’re a sort of “film forever” kind of people, but we’re not Luddites either. So we definitely have a tremendous number of digital workflows available to us in our facility.

Tom Temin: Yeah, it’s kind of a cultural thing. I mean, it takes getting used to, to look at a full-budget motion picture, and I don’t go to the movie theater very much. But to see videotape, that’s something that has been, not to make a pun, “ingrained” on people, to see the grain going by. And I guess someday we’ll get used to not seeing it.

Mike Mashon: You know it is a little interesting sometimes to go to a theater and you’re watching typically a digital cinema package. And those things, they’re very cleaned up, you don’t see the grain in the film anymore. And for people like me, sometimes that can be a little disconcerting.

Tom Temin: And what is the process by which the library decides this film should be preserved in perpetuity, this one may be not so much?

Mike Mashon: Excellent question. We really do strive to preserve all of it, frankly, particularly in terms of the video that we have in the collection. I do want to make it clear, we have a lot of videotape in our collection as well. We have some ways of doing that in robots, where we can push videotape through robots and do tremendous amounts of digitization, like 20,000 tapes a year, on average. Film is going to be a little different. We have a lot of reels of film in the collection. And you’re right, we do have to make some decisions on what is going to be preserved. We have good preservation storage. So the films that we have are stored in cold and dry conditions. So we’re able to sort of slow down their deterioration until we can get to them. But a lot of times the decisions that we’re making are going to be based on the physical condition of the material. So if the film has really started to deteriorate, we want to make sure that we can scan it as quickly as possible. But we also have a very robust loan program here. There are still a good number of theaters out there that are showing 35 mm film and we can still make 35 mm prints in addition to making digital cinema packages here. So there are films that we know will be shown in theaters. We also have a lot of our films available online through something called the National Screening Room. And we’ll make sure that those films get sent up to the laboratory for scanning.

Tom Temin: We’re speaking with Mike Mashon, he is the head of the Moving Image section at the Library of Congress. And do you work with the National Archives because they have some film like I think they have the Zapruder film, for example. And it has to be also preserved in that same manner. So how does that interaction happen?

Mike Mashon: Oh, we have a lot of interaction with other federal agencies, the National Archives being a primary one. We’re also very much involved in initiatives with the National Archives in terms of setting standards for digitization. There’s a federal group that works on that. People will frequently ask me, what’s the difference between our collection and the National Archives? The National Archives is responsible for films that were produced by the government. And we have a lot of those in our collection. But we collect even more broadly than that. It’s the reason why you’re going to find a lot of Hollywood films, home movies, educational films, in our collection. But we work really closely with the National Archives.

Tom Temin: Now this recent film that came to light was an amateur-shot film of a famous concert or a concert at which a famous and kind of unfortunate event happened. Tell us about that the Altamont concert there, The Rolling Stones are in there and some other famous artists of the day. What’s the story behind that film and who shot it and how did it come into the Library of Congress?

Mike Mashon: Well, in some ways, the story of finding that film is as interesting as the film, at least from the archival perspective. So one of my colleagues, a technician named John Snelson, is going through and processing a collection of film that we received from a man named Rick Prelinger many years ago. Rick collected a lot of films; he’s very well known in the archival field. And it’s a massive collection of well over 150,000 reels of film. So John is just sort of going through the Prelinger collection, and every once in a while, he would come across something and call my attention to it. And he said, I got this film, and the title of it is “Stones in the Park,” and I –

Tom Temin: That could be anything.

Mike Mashon: It really kind of could. Well, for me, what it triggered was, I knew that the Rolling Stones had actually made a film of a concert they did in Hyde Park in July of 1969, not long after the guitarist Brian Jones had died. So that had been filmed and released as “Stones in the Park.” But what John had turned up was an 8 mm film, and 8 mm is a home movie format, so I wasn’t really sure what it was. And I just went ahead and sent it up to the laboratory. I kind of figured, OK, we’re gonna want to know what this thing is anyhow. So I put in a digitization order, it goes up to the laboratory, and a few days later, I get a call from the lab. And the guys up there are like “Mike, you might want to come see this.” So I run upstairs, and here they’re playing the file for me. And it’s Altamont. The Altamont Free Concert was Dec. 6, 1969. Very famous concert memorialized in the film, “Gimme Shelter.” But this was clearly home movie footage shot by somebody right up by the stage. It’s silent. There’s no sound with this. But you know, you see some acts who aren’t in “Gimme Shelter.” You’ve got Santana and the Flying Burrito Brothers with Gram Parsons, and Crosby, Stills, Nash and Young are in it, in the home movie, in addition to footage of the Stones and their evening performance.

Tom Temin: And that concert became famous, it was in California at a racetrack, and that was where they hired the Hells Angels as bodyguards, and a murder occurred in the audience.

Mike Mashon: Yeah, and there’s nothing, I mean, you can see the kind of mayhem that’s breaking out, as Hells Angels are roaming the stage. But yeah, a concertgoer, Meredith Hunter was killed by a Hells Angel during the Stones set. You don’t see anything like that in this home movie. But you asked do we know who the camera person is? We do not. We would very much like to find out who shot this because I will say the film that Rick Prelinger had acquired was from a company called Palmer Films, which was a laboratory in San Francisco that went out of business. And when Palmer Films went out of business, Rick came in, scooped up all their films, added it to his collection, and then they came to us. So Palmer is no longer extant. This film is abandoned at Palmer and so we consider it to be an orphan work. We just simply, we don’t know who owns it.

Tom Temin: Well, now it’s out for the public to see and maybe someone will come forward, “Hey, I was there. I shot that with my DeJur camera or my Bolex.”

Mike Mashon: I’d love it.

Tom Temin: And just while we have you, what is your background? Do you come to this as a film content, artistic person or as a technical preservation format type of guy or how do you come to this job?

Mike Mashon: I’m a subject matter expert. I always defer to the technical people on this. My professional background actually started as an immunologist, but I really, really love movies and TV. So I went back to school and got a Ph.D. in radio, television and film, and I started at the library 24 years ago as the moving image curator. So I’m very much more on the subject matter side than anything else.

Tom Temin: Well, I’ve got my own home movies in 8 mm and, later, Super 8. If you’d like to have them, they’re welcome to go into the library. But I have a feeling it probably doesn’t quite meet the threshold.

Mike Mashon: Oh, no, you would be wrong about that, Tom. We actually very much like home movies around here. We have a lot, a lot of home movies in our collection. And some of them actually date back to the early 1900s. So it’s a pretty remarkable collection. And look, not all of them are going to be Altamont. But look, we have home movies from people who took their vacations in Germany in the mid-1930s.

Tom Temin: Wow!

Mike Mashon: It’s really fascinating stuff. But yes, we also have my home movies as well.

Tom Temin: Alright, well if you want to see a 3-year-old Tom Temin sneezing silently on the beach in Atlantic City because I was allergic to everything, it’s available.

Mike Mashon: Fantastic.

Tom Temin: Mike Mashon is head of the Moving Image section at the Library of Congress. Thanks so much for joining me.

Mike Mashon: I really enjoyed it, Tom. Thanks a lot.

DHS privacy chief aims to promote ‘privacy enhancing technologies’
Thu, 17 Feb 2022

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

The Department of Homeland Security’s chief privacy officer wants to make privacy less of an afterthought by designing systems with technologies to protect the confidentiality and integrity of information in the first place.

Lynn Parker Dupree said DHS Secretary Alejandro Mayorkas “has been very clear about the importance of addressing privacy up front, and has been very deliberate on including the privacy office early and often in policy discussions.”

The privacy office has been at the center of discussions around how to protect the health of the DHS workforce during COVID-19, initiatives to combat domestic violent extremism and the use of emerging technologies like biometrics, Dupree said.

Dupree was appointed DHS chief privacy officer in March 2021 after a brief stint as director of governance and controls in the Data Ethics and Privacy Office at Capital One. Between 2014 and 2020, she held numerous positions at the Privacy and Civil Liberties Oversight Board, including as its Executive Director. She previously worked at the DHS privacy office during the Obama administration.

“One of the things I would like to do is really begin to include privacy in technical designs,” Dupree said in an interview. “A lot of our privacy mitigations happen after a technology is developed. But I have been really working with academia and technologists to figure out how we can build tools that actually enhance privacy.”

The privacy office will host a workshop this June to present privacy researchers with specific DHS use cases “that could be solved with privacy enhancing techniques,” according to Dupree. She specifically mentioned cryptographic techniques and other “secure computing methodologies” as potential enhancements.

The DHS privacy office and privacy officers at DHS subcomponents review contracts for privacy requirements. She said her team is working with the DHS chief information officer, the agency’s science and technology arm, and the procurement directorate on incorporating new privacy enhancements.

She acknowledged the “unique” concerns around the use of biometrics, especially tests that have shown facial recognition is less accurate at identifying persons of color. The office publishes Privacy Impact Assessments to provide notice of how DHS is identifying and mitigating privacy risks.

“We will make sure that there are mechanisms in place to mitigate those risks,” Dupree said regarding potential inaccuracies and biases in biometrics technologies.

But lawmakers remain concerned about agencies’ use of facial recognition. In a Feb. 10 letter to Mayorkas, five Democrats from the House and Senate requested that DHS components end their use of facial recognition products, including those provided by Clearview AI.

“Use of increasingly powerful technologies like Clearview AI’s have the concerning potential to violate Americans’ privacy rights and exacerbate existing injustices,” the letter states.

The Secret Service, Immigration and Customs Enforcement and Customs and Border Protection have used Clearview AI’s products, according to an August 2021 Government Accountability Office report.

“I think the department is always trying to be responsive to the needs of the Congress and the department will respond back through our Office of Legislative Affairs,” Dupree said when asked about the letter.

Dupree also said she’s focused on improving DHS’s engagement with external privacy stakeholders, including advocates in academia and civil society. Last year, she organized a meeting between Mayorkas and privacy advocates.

“While the advocacy community and the department may not always be aligned, these engagements really do bring a diversity of opinions into the discussion that ultimately serve to improve our decision making processes,” she said.

Dupree said she’s also looking to “reinvigorate” the DHS Data Privacy and Integrity Advisory Committee, which advises the agency on privacy and technology.

The committee is set to meet on Tuesday, Feb. 22, when it will provide “written guidance on best practices to ensure the effective implementation of privacy requirements for information sharing across the DHS enterprise,” according to a Federal Register notice.

FOIA progress

DHS also made a big dent in its Freedom of Information Act requests backlog last year. Dupree said the agency ended fiscal year 2021 with a backlog of 25,102 requests, a 10-year low for an agency that receives the most FOIA requests across government.

“We’re using some automation to help process routine requests so that we can focus our actual manpower on the complex requests,” said Dupree, who is also DHS’s Chief FOIA Officer.

DHS has received an average of about 250,000 requests annually since 2009, according to a DHS FOIA Backlog reduction plan released in March 2020.

The majority of the backlog has been related to immigration records, with requests to CBP, ICE, U.S. Citizenship and Immigration Services, and the Office of Biometric Identity Management accounting for 92 percent of the backlog in FY-18, according to the reduction plan.

During the pandemic, Dupree said her office has provided support to components like CBP, ICE and USCIS to help them process FOIA requests.

“We just tried to increase efficiency as much as possible, and that shifting of resources really paid off,” Dupree said.

Library of Congress expanded online content in response to the pandemic https://federalnewsnetwork.com/open-datatransparency/2022/02/library-of-congress-expanded-online-content-in-response-to-the-pandemic/ https://federalnewsnetwork.com/open-datatransparency/2022/02/library-of-congress-expanded-online-content-in-response-to-the-pandemic/#respond Mon, 14 Feb 2022 17:23:34 +0000 https://federalnewsnetwork.com/?p=3908590

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

The Library of Congress has been busy building an online collection of what are known as open access e-books. The effort accelerated when the pandemic hit and people had more access to online books than to physical libraries. For more on this effort, the Digital Collections Development Coordinator Rashi Joshi, and Digital Collections Specialist Kristy Darby spoke to the Federal Drive with Tom Temin.

Interview transcript:

Tom Temin: First of all, tell us what is an open access e-book? These are titles that you might not find in your, top of the list, say, at Amazon.

Rashi Joshi: Sure. So an open access e-book is an e-book that lives on the web. It’s open access content: material that’s licensed for free and open use and redistribution. So this could be a work under an open license like Creative Commons. This is a content creator publishing their works under an open license, which allows the library to acquire the content much more easily and redistribute it widely on our website, LOC.gov.

Tom Temin: So I’m guessing this is mostly nonfiction type work, or academic and peer reviewed types of material, Kristy?

Kristy Darby: That’s right. So they are high quality, they’re peer reviewed titles, they span myriad subjects. We have books on history, philosophy, music, life sciences, mathematics, religion, economics, so most of them are nonfiction. We do have about 50 works of fiction in the collection. Some of these are new editions of classics that have been republished. And we also have some contemporary fiction in there. So there are authors who are publishing their fiction works under these licenses.

Tom Temin: And what about, you mentioned the classics, just out of curiosity? Is there a point in history at which some of the classics become public domain, like Don Quixote, for example?

Rashi Joshi: Yes, so when an item falls out of copyright, if it’s not protected by copyright law, then it becomes public domain. A lot of U.S. government works are in the public domain. So we have a host of acquisition specialists that are trained in identifying works that fall under the public domain. And those are in scope for this open access books collection.

Tom Temin: And in some ways, that’s a greater responsibility than something that has been published contemporary, because you want to make sure that the text of a Don Quixote, which I read decades ago, is preserved in a way that is sacrosanct that someone can’t kind of reissue it under some crazy label, and change the text for whatever political bias they might want to introduce.

Rashi Joshi: Sure, so everything that’s going into this open access books collection is being reviewed by acquisition specialists to make sure that the terms of redistribution are appropriate, and that we can provide wide access to the content on our website.

Tom Temin: Now, the library also had a link to the catalog of these public domain e-books. So what is the value added of the library publishing these in a collection available to the public versus just going to that catalog?

Kristy Darby: That’s a good question. We think about it in terms of enduring access. So when we take these files, we put them into managed storage, we have them available in whatever digital perpetuity means. They’re always available. And then we have our catalog records that we add these links to; they’re persistent links, so they won’t change. So anybody who has Library of Congress records will have the link to this content, it will always be there. And it will always be available. So the web is a shifting, changing thing. We don’t always know where things are going to end up. But we feel very comfortable that we are providing that enduring access for this content.

Rashi Joshi: Yeah, and there’s so much unique and high research value and ephemeral open access content on the web. By acquiring the files for this content and hosting them on library platforms, we are making a commitment to preserving and providing enduring access to this content to the American public.

Tom Temin: We’re speaking with Rashi Joshi, she’s a Digital Collections development coordinator, and with Kristy Darby, Digital Collections specialist, both in the digital content management section at the Library of Congress. And what does it require? What kind of effort is needed? Is it simply transferring the file from the catalog and putting it under an LOC URL or does it take more than that?

Kristy Darby: It’s a surprising amount of work behind the scenes. So we definitely have to acquire those files. We have to create thumbnails so on our website, people will be able to see at a glance what we have. We work on those catalog records with our cataloging specialists in the Acquisitions and Bibliographic Access Directorate. We work with Rashi’s division, so it takes a lot of coordination across the Library, a lot of people doing their part to bring it online. We work with our Office of the Chief Information Officer to support our technical infrastructure. So it’s a lot of moving parts, but we have worked over the past several years to build this workflow from the ground up. And then over the course of the pandemic, we’ve really been able to kind of refine it and now we kind of have it, it’s a well-oiled machine.

Tom Temin: Got it, and is there any vetting of the material? I mean, suppose someone got something into the catalog that really is false or known to be contrary to, I don’t know, say that the world has dipped in ether or something like that. I mean, there’s still people that believe there’s ether out in space. You wouldn’t probably want to support that, or do you just put it all on and let people make their own judgment?

Rashi Joshi: So we have a very broad collecting mission. The Library’s mission is to engage, inspire and inform Congress and the American people with a universal and enduring source of knowledge and creativity. So this is a broad collecting mission. We cover all subjects except for clinical medicine, and technical agriculture, which are covered by the National Library of Medicine, and the USDA National Agricultural Library. So as I mentioned, the collection mission is broad. The collection is universal in terms of subject, but it’s not comprehensive. So to help our subject matter experts do selection of content, there’s a lot more content out there than we can collect. We have something we call collection policy statements, we have over 70 of these. These are subject- and format-focused. So when a subject expert is assessing content for potential inclusion in our permanent collections, they’re referring to these collection policy statements. These are collaboratively developed by subject experts. They’re revised periodically, and they’re all available on our website for the public to access.

Tom Temin: And what has been the take up of this collection so far? I mean, how do you measure the success, whether anyone’s reading this stuff?

Kristy Darby: We do, every month we get a report to see what has been downloaded, what has been viewed, where the folks are who are viewing it. And it’s always really exciting to see. It’s been growing every month. Right now we have about 10,000 downloads a month. And the users, we have lots of users from the United States, of course, but we also have lots of users from Western Europe, we have lots of users in South America, lots of users in India. So it’s been really interesting to see sort of how this has grown. Every month, it seems we get about 1,000 more viewers, 1,000 more users. So we’ve been tracking that over time, we’ll continue to do that.
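The monthly usage reporting Darby describes, counting downloads per title and grouping users by country, is at heart a simple log aggregation. A minimal Python sketch, with hypothetical log records and titles (the actual report format is not described in the interview):

```python
from collections import Counter

# Hypothetical download-log records as (title, country_code) pairs,
# the kind of data a monthly usage report might be built from.
log = [
    ("Open Access History of Science", "US"),
    ("South African Children's Book", "ZA"),
    ("Open Access History of Science", "IN"),
    ("Mathematics Primer", "BR"),
    ("Open Access History of Science", "US"),
]

# Tally downloads per title and per country.
downloads_by_title = Counter(title for title, _ in log)
downloads_by_country = Counter(country for _, country in log)

print(downloads_by_title.most_common(1))  # [('Open Access History of Science', 3)]
print(downloads_by_country["US"])         # 2
```

The same two counters give both the "top five" titles Darby mentions and the geographic breakdown of users.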

Tom Temin: Do you get downloads, say to China or Russia or North Korea?

Kristy Darby: We definitely get them to China and Russia. I don’t know that I have seen North Korea on our list. But yeah, absolutely. They are downloading our content.

Tom Temin: And if there are 10,000 downloads in a month, is it 9,000 of one title? Or is it kind of across the board at the appeal of the collection?

Kristy Darby: It’s a little bit across the board. We always have a top five, the ones that really rise to the top. Very often they will be educational titles, we can tell that it looks as if a lot of educators have been hitting the collection, especially over the pandemic, which is really exciting. We also have, as part of this collection, a collection of children’s books from South Africa. So these are born digital children’s books, they were not digitized. And they were totally created online. And those get a lot of use too which is really exciting because it sort of points to this collection being for everyone. It’s not just career scholars who are using this, teachers, parents, children are also using this so that’s always really exciting.

Tom Temin: If someone wanted to create a book digitally to put in the public domain, does it have to be in the catalog from which you drew it? Or can you send it directly to the Library of Congress?

Rashi Joshi: Certainly not. So we started by looking at the Directory of Open Access Books, because it was a large repository of peer reviewed academic open access books. And certainly not all books in the directory are in scope for our collection. So we are aiming big, we’re not only looking at specific repositories, but any open access and openly available e-books on the web that are in scope for collecting as per our collection policy statements.

Tom Temin: So someone that wants to get in on those 10,000 downloads, they need to first check out the policy to see if it’s even something you’ll accept.

Rashi Joshi: Yes, so there is a donations form on the Library’s website. The public can also connect with a reference librarian. So the reference librarian is the real subject matter expert who will be assessing content to see if it’s in scope for the permanent collection or not.

Tom Temin: And I’m just wondering, do you have a sense of how the publishing landscape is changing? I mean, there have always been self-published books, since there were books as opposed to the trade titles, the Knopfs and so forth of the world. Is more and more coming into the reading public via not the standard publishers that have their editors that vet and have their policies, but this kind of new way of self publishing, that’s also not print, and also not the famous publishers?

Rashi Joshi: Sure, the landscape of ebooks and publishers is continuously evolving. And we are new to collecting open access e-books. So we are learning as we go. So there’s a whole diversity of publishers represented. And right now we have about 3,400 open access e-books in this collection in 50 languages, 100 countries of publication represented. We expect these numbers to only keep growing and not in terms of just the languages and countries of publication represented but the types of publishers, individual content creators and other types of publishers.

Tom Temin: And on the just technical front, do the formats that you have allow for mobile reading and mobile device reading?

Kristy Darby: Yes, absolutely. So we have PDFs for a lot. So a PDF is a very common format, and e-pubs as well, which is an increasingly common format. It’s very common for e-books. And those can be downloaded. You can read them right on the Library’s website, but anybody is free to download those and then read them at their leisure on their e-book reader, on their phone, on their computer. All of those are available for download.

Tom Temin: And what about the accessibility questions? Is there a read-aloud version available? Or is that a technology you’re thinking about?

Kristy Darby: It is and e-pubs are a great way because those are essentially kind of big HTML files. So you can read the text within it. A PDF is really an image. So we do have a growing number of e-pubs available.
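Darby’s point that e-pubs are “essentially kind of big HTML files” is literal: an EPUB file is a ZIP archive of XHTML content documents, which is why assistive technology can reach the text directly, whereas an image-only PDF offers nothing to read. A minimal standard-library sketch (the archive below is a toy illustration, not a complete EPUB structure):

```python
import io
import zipfile

# Build a tiny EPUB-like archive in memory. A real EPUB adds a container
# manifest and stores the mimetype entry uncompressed; the point here is
# only that the content documents are plain XHTML inside a ZIP.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("mimetype", "application/epub+zip")
    z.writestr("OEBPS/chapter1.xhtml",
               "<html><body><p>Once upon a time...</p></body></html>")

# Reading the text back needs nothing beyond unzipping: this is what
# makes the format accessible to screen readers and text tooling.
with zipfile.ZipFile(buf) as z:
    text = z.read("OEBPS/chapter1.xhtml").decode("utf-8")

print("Once upon a time" in text)  # True
```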

Tom Temin: Because I was thinking for South African children’s novels, you could probably get famous actors to read them gratis, versus having some sort of robotic voice generation read them.

Kristy Darby: That would be lovely.

Tom Temin: There’s a free idea for you, see what Hollywood thinks about that. And a final tech question: is all of this searchable and keyword findable? Because you have 3,400, and that catalog has thousands and thousands more, so there’s really no limit to how big this can get.

Kristy Darby: One thing that we pride ourselves on at the Library of Congress is really great bibliographic description. And these books are cataloged. They are available with subject headings, and they’re fully searchable in the catalog and on the website. So people can search them by title by subject by author by publisher, and they’re integrated into the Library’s catalog as well. So they’re there with everything else.

Tom Temin: Well, I’m going to check them out myself. Kristy Darby is Digital Collections specialist and Rashi Joshi is Digital Collections development coordinator, both in the digital content management section at the Library of Congress. Thanks so much for joining me.

Rashi Joshi: Thank you so much. It’s been a pleasure.

Kristy Darby: Thanks so much, Tom.

IRS walks away from facial recognition to access online tools after backlash https://federalnewsnetwork.com/it-modernization/2022/02/irs-walks-away-from-facial-recognition-to-access-online-tools-following-backlash/ https://federalnewsnetwork.com/it-modernization/2022/02/irs-walks-away-from-facial-recognition-to-access-online-tools-following-backlash/#respond Mon, 07 Feb 2022 21:46:52 +0000 https://federalnewsnetwork.com/?p=3896367 The IRS will transition away from using facial recognition technology to help taxpayers create online accounts with the agency.

The IRS announced Monday that it will “quickly develop and bring online an additional authentication process that does not involve facial recognition,” in order for taxpayers to access self-help services on the agency’s website.

“The IRS will also continue to work with its cross-government partners to develop authentication methods that protect taxpayer data and ensure broad access to online tools,” the agency said in a statement.

The IRS changed course after recent criticism from Congress and associations who took issue with the agency’s partnership with the private facial recognition service ID.me.

The agency launched the new identity verification process in November, which required taxpayers to sign in with an ID.me account, or create one.

The process required taxpayers to provide a photo of a government-issued document, including a driver’s license, state ID or passport, then take a “selfie” with a smartphone or computer webcam.

Once verified, taxpayers could access IRS online services, such as the child tax credit portal, their online account or the agency’s “Get Transcript Online” feature. Taxpayers, once logged in, could also request an Identity Protection PIN or access an Online Payment Agreement.

The IRS said it would allow taxpayers with existing accounts under the old sign-in process to continue using those credentials through summer 2022.

Senate Finance Committee Chairman Ron Wyden (D-Ore.) said Monday that the Treasury Department told him it was in the process of having the IRS transition away from using ID.me to verify IRS.gov accounts.

Wyden sent a letter earlier Monday, urging the agency to reconsider its use of facial recognition technology, and consider using Login.gov, a federal identity verification service already used by 40 million Americans for 200 websites from 28 agencies.

Wyden called Treasury and the IRS’ decision to walk away from facial recognition technology a “smart move.”

“I understand the transition process may take time, but I appreciate that the administration recognizes that privacy and security are not mutually exclusive and no one should be forced to submit to facial recognition to access critical government services,” Wyden said.

Last week, lawmakers repeatedly pressed the IRS for more details about its use of facial technology, or demanded the agency pull the plug on its use of these tools.

Last Thursday, 15 Senate Republicans on the Senate Finance Committee told the IRS they were deeply concerned over its partnership with ID.me, noting that federal agencies, including the IRS, “have an unfortunate history of data breaches.”

Sens. Roy Blunt (R-Mo.) and Jeff Merkley (D-Ore.) sent a letter last Thursday urging the IRS to “immediately discontinue” all programs that use any type of biometric data to identify taxpayers.

Senate Commerce, Science, and Transportation Committee Ranking Member Roger Wicker (R-Miss.) also sent a letter to the IRS, asking the agency for more details on how it would protect this sensitive biometric data, and whether it consulted with the Federal Trade Commission or the National Institute of Standards and Technology before launching this partnership.

The IT Acquisition Advisory Council, a group of senior executive professionals from government and industry, warned that requiring a selfie from taxpayers to get help online raises privacy concerns, and would create a disadvantage for taxpayers who don’t have access to a smartphone, or who don’t feel comfortable giving their biometric information to a third-party vendor to access government services.

“Not only does this decision create an unprecedented personal privacy issue, but it also allows for the potential of personally identifiable information being harvested and then utilized for commercial for-profit purposes,” the association wrote.

The association also raised concerns about the reliability and dependability of facial recognition technology to verify someone’s identity online. It also urged the IRS to take advantage of Login.gov, an identity verification platform already used by several agencies.

“The public deserves answers to this legitimate question particularly when login.gov is a viable option currently being utilized and a platform where the government is the steward of the personal information that citizens provide as opposed to a 3rd party commercial for-profit company,” the association wrote.

Reps. Ted Lieu (D-Calif.), Yvette Clarke (D-N.Y.), Pramila Jayapal (D-Wash.) and Anna Eshoo (D-Calif.), in their own letter to the IRS, took issue with the vendor’s inconsistent claims on whether it used “one-to-many face recognition,” which compares a facial image to many other similar images in a database.

“Given these issues, it is simply wrong to compel millions of Americans to place trust in this new protocol,” the lawmakers wrote.
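The distinction the lawmakers draw, one-to-one verification versus one-to-many identification, can be made concrete. Face recognition systems typically map each image to an embedding vector and compare vectors by similarity; the sketch below is a simplified illustration of the two modes, not a description of ID.me’s actual system (the names, threshold and vectors are invented):

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

THRESHOLD = 0.9  # illustrative match threshold

def verify(probe, enrolled):
    """1:1 verification: compare the probe against one claimed identity."""
    return cosine(probe, enrolled) >= THRESHOLD

def identify(probe, database):
    """1:N identification: search the probe against every enrolled face."""
    return [name for name, emb in database.items()
            if cosine(probe, emb) >= THRESHOLD]

db = {"alice": [1.0, 0.0, 0.1], "bob": [0.0, 1.0, 0.0]}
probe = [0.98, 0.02, 0.12]

print(verify(probe, db["alice"]))  # True: matches the single claimed identity
print(identify(probe, db))         # ['alice']: found by searching the database
```

The privacy objection centers on `identify`: a 1:N search requires retaining a database of many people’s biometric templates, while 1:1 verification only compares against the record the user themselves claims.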

An ID.me spokesperson referred to the IRS for “any questions on this issue,” and a Treasury Department spokesperson referred to the IRS’ statement on Monday.
