White House calls for consistent rules for disclosing foreign research funding | Science

President Joe Biden’s administration last week ordered federal agencies to draft uniform policies describing the outside sources of funding that scientists must disclose when they apply for federal grants, and the penalties for failing to do so. Research groups welcome the directive, but wish it had also specified what kinds of foreign collaborations might get a scientist in trouble.  

The new directive, issued on 4 January by the White House Office of Science and Technology Policy (OSTP), feeds into a roiling political debate about how to protect federally funded research from attempted theft by some foreign governments. In recent years, the federal government has prosecuted some two dozen academics for failing to disclose financial ties to China, a campaign that critics say has criminalized minor violations of often confusing federal rules and chilled research collaborations.

The 34-page OSTP memo fleshes out a proposal to improve research security issued 1 year ago, in the final days of then-President Donald Trump’s administration, as well as a recent congressional mandate with the same goal. Academic research administrators expect it to help clarify the responsibilities of faculty and their institutions, which officially are the recipients of any federal grant.

“This will provide consistent guidance to those [within government agencies] writing the rules,” says Mary Millsaps, director of research compliance at North Carolina State University. “And that’s been missing.”

Unfortunately, according to university administrators, what is still missing is any guidance on the specific research affiliations that pose a risk to national security and might prevent a scientist from obtaining federal funding. Having a list of so-called bad actors that researchers should avoid, they say, would help them advise faculty who are applying for federal grants.

Not having such information could put “undue, vague, and implicit pressures on researchers,” OSTP Director Eric Lander concedes in an introduction to the memo. That uncertainty, he adds, could “create a chilling atmosphere that would only constrain and damage the U.S. scientific enterprise.”

But Lander promises that “OSTP intends to address [such questions] in the future.” And one federal agency already has. Although the OSTP report doesn’t mention its efforts, the Defense Advanced Research Projects Agency has recently developed and begun to use a risk matrix to vet its top-ranked proposals before making an award (see sidebar, below).

Search for consensus

In addition to laying out what needs to be disclosed, the OSTP guidance covers two other elements of research security. It suggests agencies think about giving every federally funded scientist a digital personal identifier that would reduce the chances of misidentification and make scientists easier to track. It would be accompanied by an electronic CV that would be accessible by every agency. The directive also tells agencies to require that institutions provide research security training for faculty and relevant staff, teaching them about everything from the threats posed by cyberattacks to the perils of taking their laptops when traveling abroad.

The guidance also tells agencies to find ways to share information on both violations and “potential violations” of disclosure requirements. The second category raises significant privacy concerns for attorney Audrey Anderson, a former university general counsel now at Bass, Berry & Sims, a law firm. “If an agency later finds [a ‘potential violation’] was not a violation, will they also share that information?” she asks. “An allegation jeopardizes the reputations of the university and the scientist,” she adds, even if they are later found blameless.

But the report’s main thrust is on the need to reach consensus on what must be disclosed when applying for a federal research grant. Lander is asking an interagency group within the White House “to develop model grant application forms and instructions that can be used (and adapted where required) by any federal research funding agency.” And he wants to see those products by early May.

Whatever surfaces is likely to hew closely to what the National Institutes of Health (NIH) and the National Science Foundation (NSF) already require from the institutions they fund. Although their disclosure rules are not identical—they differ in what types of consulting activities, student mentoring, and honoraria need to be disclosed, for example—the two agencies are far ahead of the rest of the federal government in addressing these issues. And the new guidance is intended to prod other research funding agencies into action.

“My guess is that NIH and NSF [officials] will try to harmonize their rules, and then get the other agencies to go along,” says one expert who closely monitors federal regulatory policies for research. “I don’t expect them to go backwards.”

But others fear that is exactly what might happen. “Will there be a fight over which agency gets to be the least common denominator in setting standards for disclosure?” one research lobbyist wonders.

The guidance gives agencies the leeway to tailor their rules to conform to existing congressional mandates, including rules governing the oversight of sensitive technologies. It also allows for variations based on “other compelling reasons.”

That could be a huge loophole and potential stumbling block to developing uniform policies, observers say. “Agencies are always going to insist on having flexibility,” Millsaps says. “But the question is, will they conform to the spirit of the guidance, or will they go rogue?”

Anderson adds another note of caution for those hoping for greater clarity at the 120-day deadline Lander has set. "It's one thing for the White House to say it, but it's another thing to do it," she says. "They have provided agencies with a template and instructions. But will agencies actually implement the guidance?"

Related story

DARPA adopts risk rubric to judge grant applicants

By Jeffrey Mervis

Scientific merit still comes first. But the Defense Advanced Research Projects Agency (DARPA) has also begun to weigh the potential risk to national security of funding a scientist who has ties to foreign governments.

The move makes DARPA, a $3.5-billion-a-year research unit within the Department of Defense, the first federal research agency to spell out how it will use the information on funding sources that grant applicants are required to disclose. That issue is not addressed in this week’s White House call for federal agencies to standardize their disclosure policies (see main story, above).

Top-ranked DARPA proposals are now getting a second vetting based on a newly developed “risk rubric.” It requires program managers to rate key scientists involved in the proposed research on a four-point scale, running from very high to low risk, based on research support they receive from foreign countries or entities. It asks, for example, whether applicants have participated in a foreign talent recruitment program or worked with a so-called “denied entity”—a person, company, or institution the U.S. government has flagged because of security concerns. Scientists would fall into a less risky category if they belong to a foreign talent program run by a U.S. ally rather than one run by China (although China is not named).

Proposals involving scientists who score as “very high” or “high” risk may need to be revised, DARPA Director Stefanie Tompkins explained in a 17 September 2021 memo outlining the new approach. The remedies could include closer monitoring of visitors to the funded lab or assigning another scientist as principal investigator. DARPA officials will then decide whether that “mitigation plan” addresses the potential threat, the memo says, or whether “to accept the risk” and make the award anyway.

DARPA began to work on the rubric nearly 2 years ago in response to a congressional mandate to tighten its oversight of researchers potentially vulnerable to foreign influences. But the initial version, released in September 2021 with Tompkins’s memo, didn’t go over very well with the research community. That’s because it included an assessment of a researcher’s personal relationship to foreign entities—including “family, friends, professional and financial” ties.

The mention of family and friends sparked concerns that disclosures could lead to racial profiling, says Deborah Altenburg, a senior official at the Association of Public and Land-grant Universities. Higher education lobbyists flagged the problem during meetings Altenburg had arranged with DARPA officials, she says, and DARPA was quick to drop that language in the new version posted last month.

The problematic language “should not have been included. … We missed it,” says Kevin Flaherty, a senior policy manager at DARPA who helped craft the rubric. The agency belatedly realized it didn’t want—or need—to assess a researcher’s personal ties to another country, he adds.

The Department of Energy (DOE) has also developed a risk matrix that applies to the work of researchers at its national laboratories. But it flags proposals according to specific technologies of concern, rather than examining the foreign ties of the leading scientists on the project. (DOE rules bar its scientists from participating in talent recruitment programs from China, Russia, Iran, and North Korea.)

The DARPA rubric applies to fundamental research that is not classified. That could make it a model for the government’s two largest civilian research agencies, the National Institutes of Health (NIH) and the National Science Foundation (NSF), says one expert on research compliance who requested anonymity to speak freely about federal policies. “DARPA has gone farther in identifying which contacts could be considered riskier,” the expert notes. “So it could be adopted by NIH or NSF.”