From manuscript preparation to business-related best practices, scholarly publishers increasingly integrate data capture and analysis into their systems. These efforts are considered essential to enable interoperability, ensure transparency, and build trust with authors, funders, and institutions.

On Thursday, October 20, at the Frankfurt Book Fair, CCC presented The Data Quality Imperative: Improving the Scholarly Publishing Ecosystem, a special Frankfurt Studio session. Watch the recording.


Panelists Sybille Geisenheyner, Director of Open Science Strategy & Licensing, American Chemical Society; Dr. Johanna Havemann, Trainer and Consultant in Open Science Communication, and Co-founder & Strategic Coordinator, AfricaArxiv; and Laura Cox, Senior Director, Publishing Industry Data, CCC, all shared insights with me on the impact of a consistent data quality strategy on the people, processes, and technology in your organization.

“There needs to be trust in the data, and that is always something we start with,” said Geisenheyner. “The data will never be 100% perfect, but it can get even better. How many articles get published? What is the APC spend on an individual basis? What is the subscription spend? We do have a lot of data. And to share that data is key.”


Author: Christopher Kenneally

Christopher Kenneally hosts CCC's Velocity of Content podcast series, which debuted in 2006 and is the longest continuously running podcast covering the publishing industry. As CCC's Senior Director, Marketing, he is responsible for organizing and hosting programs that address the business needs of all stakeholders in publishing and research. His reporting has appeared in the New York Times, Boston Globe, Los Angeles Times, The Independent (London), WBUR-FM, NPR, and WGBH-TV.