New Guidelines Aim to Improve “Best Available Science”

by Laurie Schreiber

“I’m thinking that, as we move forward, we might want to reflect on those recommendations to improve the peer review process. I wonder, now, to what extent are we going to be constrained by this new guidance such that we won’t be able to go in the direction suggested.” – David Pierce  © Photo by Sam Murfitt

DANVERS, Mass. – The New England Fishery Management Council (NEFMC) heard a presentation at its December meeting on revisions made to the federal fishery guidelines that support the use of “best available science.” The revisions aim to improve fishery management nationwide.

National Standard 2 says, “conservation and management measures shall be based upon the best scientific information available.” It is one of 10 standards established by the Magnuson-Stevens Fishery Conservation and Management Act (MSA) to guide the preparation of fishery management plans. In 2007, the MSA reauthorization added provisions to improve the use of science in decision-making.

“More specifically, Congress called for the Scientific and Statistical Committees (SSC) associated with each regional fishery management council to have a stronger role in reviewing scientific information and providing input on fishing level recommendations,” according to the National Oceanic and Atmospheric Administration (NOAA). “Also, the revisions specified that the Secretary of Commerce and councils must establish a peer review process for scientific information used to advise councils on the conservation and management of fisheries.”

To address the changes, NOAA’s National Marine Fisheries Service (NMFS) revised the National Standard 2 guidelines to include: guidance on what constitutes best scientific information available; standards for scientific peer review; clarification of the SSC’s role in reviewing scientific information for the councils; and expansion and clarification of the contents and purpose of Stock Assessment and Fishery Evaluation (SAFE) reports and related documents.

NOAA finalized the revisions to National Standard 2 in July 2013.

Dr. James Weinberg gave the presentation on National Standard 2. Weinberg chairs what is referred to as “SAW/SARC” – the Northeast Regional Stock Assessment Workshop and its Stock Assessment Review Committee, a formal scientific peer-review process for evaluating stock assessments.

According to Weinberg, the revisions “make only modest adjustments to current operating practices.” At the same time, he said, “Formalizing these guidelines will strengthen the reliability and credibility of scientific information used by NOAA, thus improving public trust and benefiting stakeholders.” Weinberg said that, according to National Standard 2 guidelines, the definition of “best scientific information available” is “a dynamic process involving continuous improvements.”

“The message we should be conveying from the [science] center’s side is, we’re open to constructive change. I think it’s fair to say that much of our science is in dispute in the eyes of industry. I’m not seeing anything explicitly that deals with engendering more credibility with the industry.” – Bill Karp, Northeast Fisheries Science Center  © Photo by Sam Murfitt

He said peer review standards were improved in order to avoid conflicts of interest by reviewers, who must not contribute to the development of the scientific information under review.

The revisions also emphasize the importance of SAFE reports as the source of scientific information on managed fish stocks, and provide that SAFE reports must be made publicly available online. SAFE reports are catch-all documents that summarize all of the scientific information used to evaluate a fish stock.

Weinberg said the revisions are based on the concepts of relevance, inclusiveness, objectivity, transparency, timeliness, verification, validation, and peer review.

“‘Relevance’ has to do with the pertinence to the current question,” Weinberg said.

If, for example, questions related to one species or coastline are also relevant for another, the evaluation results of one may be considered for the other.

“Inclusiveness” takes in the range of disciplines – for example, biology and oceanography – that might have information relevant to an evaluation. Diverse opinions are also important, including the local knowledge that comes from fishermen and other stakeholders.

“Objectivity” has to do with accuracy and precision of information, as well as the need to keep the scientific process free of undue nonscientific influence, Weinberg said.

“Transparency” has to do with open access to the review process, and allows for public comment at appropriate times.

“Timeliness” means that science under evaluation should be provided to fishery managers promptly, so that management decisions are not delayed. This implies that, in some cases, management decisions may be critical enough that the decision-making process needs to move forward even if the science isn’t fully evaluated and fleshed out.

“So there’s flexibility and reasonableness built into the guidelines,” Weinberg said. Verification and validation have to do with research methods used. Peer review is a process to ensure the quality of the scientific product.

Overall, said Weinberg, “There can’t be a tight definition of what ‘best scientific information available’ is, because science is constantly changing. So in order to evaluate what ‘best scientific information available’ is, one should consider all the criteria.”

NEFMC member David Pierce noted that, in 2011, fisheries scientists Michael Sissenwine and Brian Rothschild produced a national review of NMFS science. The report concluded that fishery research is generally of high quality.

“But they did find many problems, and they suggested recommendations that the process used to produce scientific advice to support management needs to be changed,” Pierce said.

The Sissenwine/Rothschild report’s recommendations called for improved collaboration with partners on science, cooperative research with industry, and reviews of science programs; improved communication; clarification of roles and responsibilities; and improved data management systems.

Pierce continued, “I’m thinking that, as we move forward, we might want to reflect on those recommendations to improve the peer review process. I wonder, now, to what extent are we going to be constrained by this new guidance such that we won’t be able to go in the direction suggested by Sissenwine and Rothschild.”

“I don’t think we are constraining ourselves too tightly on this,” said Weinberg.

“The message we should be conveying from the [science] center’s side is, we’re open to constructive change,” said NEFSC director Bill Karp. Karp said it will be important to continue program reviews that were begun with the Sissenwine/Rothschild report.

“The conversation we need to have with the councils and other partners, with regard to changes we need to make in the overall process, is important,” Karp said.

Karp also said the National Standard 2 revisions didn’t go far enough.

“I think it’s fair to say that much of our science is in dispute in the eyes of industry,” he said. “I’m not seeing anything explicitly that deals with engendering more credibility with the industry.”

But Weinberg suggested that improvements to the peer review process, the stock assessment process, and data utilization are bound to result in better products. That, in turn, should lead to greater credibility with the public, he said.

“I think the best we can do is to follow the guidelines on the science end, and produce better products,” Weinberg said. “Reaching out to the public is very important, and that means doing the best science we can.”
