In the first eight parts of this series we learned how a breast cancer “epidemic” among wealthy white women was created by excess screening rates and promoted by the national media as a frightening reality. We learned about overdiagnosis and the true risk of getting breast cancer. We explored the perils of mammography and the hype around hormone replacement therapy. We deconstructed a series of influential scientific studies that turned out to be based on faulty data, unproven assumptions and racial bias. This week, we report that the California Cancer Registry data used to calculate the state’s burden of breast cancer is riddled with errors.

 

The California Cancer Registry was created in 1985 under the supervision of Kenneth Kizer of the California Department of Health Services. The state-run registry bills itself as the most modern, reliable and error-free registry in the world. But according to internal audits, experts who work with it and independent data-quality studies, some of the registry’s datasets are contaminated with inaccurate information.

A division of the University of California, Davis contracts with the California Department of Public Health to manage the registry’s central database. Human and computer errors are inevitable, and in a database of some four million records, there are bound to be glitches. But the registry’s data-quality issues are structural, not merely a matter of performance, and even a small number of errors can bias or invalidate the research projects built on the data.

A 2013 data-quality audit found that 33 percent of a sample of new lung cancers recorded by registry employees during a single year contained wrong information about the spread of the tumors. Other audits revealed indolent breast cancers described as invasive, and invasive cancers described as indolent. These are consequential errors: whether a tumor is invasive and how far it has spread determine both how cases are counted and how treatment outcomes are judged. And because audits are expensive and rarely performed, most errors go undetected.

The problems afflicting the cancer registries have been well known to industry insiders for decades. In 2003, scientists concluded that publicizing the poor quality of the California Cancer Registry’s data could “seriously undermine public confidence in this process.” California is not alone, though. In 2011, researchers from the Centers for Disease Control and the California registry investigated the accuracy of breast and prostate cancer data in a nationwide selection of registries. The study evaluated the fitness of registry data for research by comparing it to hospital records.

The report found that 53 percent of the cases examined were of “good quality or better.” Forty-seven percent were not.

 

Bureaucratic anarchy

The California Cancer Registry is part of a two-tiered national system set up to track cancer incidence and types. The National Cancer Institute funds SEER, which stands for the Surveillance, Epidemiology and End Results Program. SEER collects and stores hospital and physician-reported cancer data for 28 percent of the United States population; California alone supplies half of the SEER data.

The Centers for Disease Control separately funds the National Program of Cancer Registries, which covers the rest of the country through a state-based network. California’s registry system is funded by both the C.D.C. and the National Cancer Institute, and it reports to both. The management of the bifurcated national system is not centralized, however: the two agencies use different software systems to manage cancer data, and they do not mirror each other’s data-quality metrics.

Adding to the bureaucratic anarchy, contractors working within the California registry themselves use often-incompatible software systems. Software disparities make it difficult to comb registry datasets for errors or add follow-up information about individual case histories.

There are other obstacles to collecting accurate and complete case data. Medical records are designed to facilitate reimbursement from insurance companies; they are not designed to assess the quality of patient care or to support epidemiological investigations, the purposes for which the registry system exists.

State law requires hospitals and private practices to submit cancer registry forms for new diagnoses within six months, but the work is laborious and the reporting mandate is often ignored or only partly fulfilled.

Nevertheless, researchers have for decades relied on the accuracy and completeness of California Cancer Registry data to calculate incidence rates and assess the efficacy of treatments.

 

Stop the bleeding

In 2003, the Journal of the National Cancer Institute published a report called “Validity of Cancer Registry Data for Measuring the Quality of Breast Cancer Care.” The lead author was Jennifer Malin of the University of California, Los Angeles School of Medicine. The report compared California Cancer Registry breast cancer case abstracts to source medical records, looking for discrepancies.

The results were shocking. The researchers found that the more advanced the stage of the disease and the older the patient, the less accurate the registry data. They found that the registry missed or misrepresented significant numbers of biopsy findings and misidentified 6 percent of surgeries performed in hospitals. The registry failed to identify 44 percent of patients who had received chemotherapy and 64 percent of those who had received hormone therapies in outpatient clinics. Registry data did not agree with medical records for 58 percent of stage IV breast cancers.

“Reporting the quality of care derived from data with such validity problems could anger providers and seriously undermine public confidence in this process,” Dr. Malin wrote.

She predicted it would cost $250 million a year to substantially improve the registry’s accuracy. And even those improvements would likely be thwarted by “the fragmented nature of the U.S. health care system.”

Since the report was published, the reliability of registry data has further deteriorated, according to internal audits and progress reports.

In 2012, Dr. Malin joined a task force to brainstorm ways to improve the effectiveness of the cancer registry. Other members include Dr. Kizer, who manages the cancer registry’s central operation at the University of California, Davis; Michael Hogarth, a Davis professor specializing in medical information systems; and University of California, San Francisco breast cancer researcher Laura Esserman.

According to the task force, the problem with the registry is that it was designed to track cancer incidence, however imperfectly; it was not designed to collect information that could be used to improve the quality of medical care for patients.

In a March 2015 study published in the Journal of the National Cancer Institute, members of the task force observed that the California Cancer Registry often fails to report critical data about the biological nature of a patient’s cancer.

They observed that the state registry, like the national registry system as a whole, captures “limited information on initial treatments and no information on recurrence and subsequent treatments.” The report calls for the registry to collect more information about what occurs after initial cancer diagnoses.

Tracking treatment results, it is worth noting, is impossible without complete and accurate follow-up data on diagnosis, treatment and recurrence. As long as the California registry lacks reliable methods of reporting, experts say, tracking quality of care may be an intractable problem, despite the hoped-for efficiency of electronic records.

“More data is not always better data,” Dr. Hogarth remarked to the Light. The multiplicity of electronic record systems already in place “precludes organizing them into one reporting system,” he wrote in a 2008 report. Contrary to Silicon Valley dreams, electronic health record systems are “inflexible, proprietary, nonintuitive, expensive, difficult to maintain and rarely interoperable across health systems.”

Dr. Hogarth explained why the quality of California’s breast cancer data is particularly “sobering.” “The breast cancer information is sprinkled across the typical medical chart, with no place for key data to be summarized,” he said. “The first diagnosis is usually abstracted for the registry records, but there is seldom any follow-up. Doctors keep medical records to get reimbursed. Insurers pay for volume, not quality of procedures. And the codes keep changing. Monitoring the accuracy of registry coding is expensive and done only infrequently.”

Reforming the “dysfunctional” registry would require passing new state laws to redesign and govern it and its nonprofit contractors, Dr. Hogarth said. In fact, he and Dr. Esserman want to radically transform the registry system into one capable of producing better treatments for patients in real time. But, Dr. Esserman told the Light in an email, “First, we have to stop the bleeding.”

 

Money trails

The state health department contracts out the management of the registry’s regional operations to the Public Health Institute, the Cancer Prevention Institute of California and the University of Southern California. The National Cancer Institute has awarded each of these a $275 million contract. The Centers for Disease Control spends about $10 million a year to support registry operations.

During the statewide registry’s first quarter-century, the Oakland-based Public Health Institute largely ran the show. Public records show that from 2002 to 2012, the institute earned $200 million for supervising the collection and verification of cancer diagnoses submitted to regional registries by hospital staff and independent physicians, radiologists and pathologists.

In 2011, the California Department of Public Health went looking for a new state registry manager to replace the Public Health Institute. The bid solicitation explained, “By 1998, it had become apparent that the statewide regional registry model with eight separate regional registry systems, using eight separate databases, running on four completely different Cancer Registry Software solutions was inefficient, outdated [and] inherently created significant data quality issues such as duplicate cancer case reporting.”

An attempt to merge all of the data streaming in from thousands of sources into a single platform did not solve the problem, even as millions of dollars were thrown at it. And, decade after decade, the same people and institutions remained in charge as the system stalled.

In 2009, the California State Auditor questioned the legality of the Department of Public Health contracting out work that could be done by health department employees. The department’s solution was to award “grants,” as opposed to “contracts.”

The cosmetic change allowed the registry to keep its private contractors on the public payroll, while complying with the letter of the law by transferring the central data management function from the private Public Health Institute to a government entity.

In 2012, the department awarded a $29 million, five-year grant for overseeing the registry’s data systems to the Institute for Population Health Improvement at the University of California, Davis—which, to come full circle, is run by Dr. Kizer, who designed the registry nearly three decades ago.

But transferring data management authority to Davis did not improve data quality, public records show. And while a few executives played musical chairs, most employees kept their jobs. The Public Health Institute retained management of seven regional registries, the University of Southern California kept control of the Los Angeles region and the Cancer Prevention Institute of California remained in charge of the two San Francisco Bay Area regions.

In 2013, the chief executive officer of the Cancer Prevention Institute, Sally Glaser, informed the National Cancer Institute that the Davis group was “without experience in this work,” a fact that may account for the continued decline of the registry’s data quality.

The grant calls for Dr. Kizer’s institute to fix the registry’s chronic data-quality problems, but his team has experienced difficulties hiring competent programmers because it must abide by the civil-service hiring system, which rewards seniority, not competence.

So Dr. Kizer punted, subcontracting the operation of the central computer system to a private company, Quest Media and Supplies, Inc., for more than $500,000 a year. According to public records, he asked the state to rewrite the grant to include a less burdensome scope of work for his beleaguered team.

Dr. Kizer delayed publishing the reconfigured registry’s first annual report on cancer statistics for several months while he engaged in a turf war with top officials at the state health department over who would get top billing as its author.

Public records also show that during the second year of the grant, Dr. Kizer’s team reported that “each of the eight regional registries had completely different methods and technical solutions of the process of contacting reporting sources of data. Some of the regions did not have a documented process.” He declined to be interviewed by the Light about how he plans to fix the registry’s structural and performance problems.

University of Chicago breast cancer risk specialist Olufunmilayo Olopade told the Light, “All of the California Cancer Registry’s epidemiological data needs to be reassessed and reanalyzed. It is one of the leading registries, but everybody who uses its data knows that it is imperfect. Anytime you have to code and recode and humans are entering data, there is always going to be error.”

 

Connecting the dots

From November 2014 through February 2015, top officials at the California Department of Public Health; the University of California, Davis; the Public Health Institute; the Cancer Prevention Institute; the Centers for Disease Control; and the National Cancer Institute sent streams of emails back and forth, discussing how best not to provide the Light with interviews.

“Outsiders rarely look at these documents, so there could be a great deal of questions,” Greg Oliva, the assistant deputy director at the state Department of Public Health, told staff in an email.

“We should definitely discuss this on tomorrow’s Directors’ call … [Mr. Byrne] has communicated with [the National Cancer Institute] and now [sent them a Freedom of Information Act request] for a lot of stuff. … Let’s all get on the same page,” Sally Glaser, the director of the Cancer Prevention Institute, wrote to Kurt Snipes, who oversees the state registry.

Kevin Sherin, a deputy director of the state health department, advised: “Sometimes, though, we need to build a relationship with reporters.”

But Mr. Oliva, the assistant deputy director, opined: “The challenge is that we are not likely to get this reporter to see our side given past history and the types of stories he is interested in. We just want a casual conversation to discuss his ask, refining it, minimizing it. He wants an ‘interview.’ Not the rules of engagement that favor our side.”

He emailed Dr. Snipes: “I am trying to get [Dr. Sherin] to let it go, but he somehow thinks we can make some magic and have a tea party with this guy and make everything go away.”

“I am not touching this any further,” Dr. Snipes replied.

Dr. Sherin chimed in, “I wonder who he works for? Is there a relationship with someone worth building? Editor owner. Strategies.”

University of California, Davis public relations official Carol Gan emailed an executive at the state registry, Kenneth Kizer: “This is more than just a small newspaper story. Byrne is an investigative reporter/science writer [for] Scientific American.”

Mr. Oliva cautioned Dr. Snipes, “Kurt, I think [Dr. Sherin] needs some level of assurance that this is not going to bite us. We may not be able to ensure that. [Mr. Byrne] connected the dots [regarding] audit discrepancies.”

 

Next week, Busted! will dig deeper into the troubling findings of the California registry’s internal audits. The series will conclude with the story of a Marin woman who discovered that her registry record does not match the facts of her breast cancer diagnosis.

 

Read part one here: https://www.ptreyeslight.com/article/busted-breast-cancer-money-and-media-part-one-renee-willards-story

Read part two here: https://www.ptreyeslight.com/article/busted-breast-cancer-money-and-media-part-two-climbing-risk-mountain

Read part three here: https://www.ptreyeslight.com/article/busted-breast-cancer-money-and-media-part-three-marin-syndrome

Read part four here: https://www.ptreyeslight.com/article/busted-breast-cancer-money-and-media-part-two-climbing-risk-mountain

Read part five here: https://www.ptreyeslight.com/article/busted-breast-cancer-money-and-media-part-five-does-hormone-replacement-therapy-cause-breast

Read part six here: https://www.ptreyeslight.com/article/busted-breast-cancer-money-and-media-part-six-perils-mammography

Read part seven here: https://www.ptreyeslight.com/article/busted-breast-cancer-money-and-media-part-seven-life-and-death-marin-womens-study

Read part eight here: https://www.ptreyeslight.com/article/busted-breast-cancer-money-and-media-part-eight-how-zero-breast-cancer-pays-its-bills

Read part ten here: https://www.ptreyeslight.com/article/busted-breast-cancer-money-and-media-part-ten-bad-data-audit-trail

Read part eleven here: https://www.ptreyeslight.com/article/busted-breast-cancer-money-and-media-part-eleven-story-abigail-adams-how-california-cancer