News & Articles

Watch: CODATA/RDA Summer School 2018-2019 Interviews


Read more about CODATA-RDA School of Research Data Science


The Beijing Declaration on Research Data

Grand challenges related to the environment, human health, and sustainability confront science and society. Understanding and mitigating these challenges in a rapidly changing environment require data[1] to be FAIR (Findable, Accessible, Interoperable, and Reusable) and as open as possible on a global basis. Scientific discovery must not be impeded unnecessarily by fragmented and closed systems, and the stewardship of research data should avoid defaulting to the traditional, proprietary approach of scholarly publishing. Therefore, the adoption of new policies and principles, coordinated and implemented globally, is necessary for research data and the associated infrastructures, tools, services, and practices. The time to act on the basis of solid policies for research data is now.
The Beijing Declaration is intended as a timely statement of core principles to encourage global cooperation, especially for public research data. It builds on and acknowledges the many national and international efforts that have been undertaken in the policy and technical spheres on a worldwide basis.  These major contributions are listed in the Appendix. 
 Several emergent global trends justify and precipitate this declaration of principles:
  • Massive global challenges require multilateral and cross-disciplinary cooperation and the broad reuse of data to improve coherence across recent UN landmark agreements, such as the Paris Climate Agreement, the Sendai Framework for Disaster Risk Reduction, the Sustainable Development Goals (SDGs), the Convention on Biological Diversity, the Plant Treaty, the World Humanitarian Summit, and others. The comprehensive agendas for action provided by these agreements require access to and reuse of all kinds of data.
  • Research and problem-solving, especially addressing the SDG challenges, are increasingly complex and driven by ‘big data’, resulting in the need to combine and reuse very diverse data resources across multiple fields. This poses an enormous challenge in the interoperability of data and responsible stewardship, with full respect for privacy.
  • Rapid advances in the technologies that generate and analyze data pose major challenges concerning data volume, harmonization, management, sharing, and reuse. At the same time, emerging technologies (including machine learning) offer new opportunities that require access to reusable data available in distributed, yet interoperable, international data resources.
  • Changing norms and ethics encourage high-quality research through greater transparency, promote the reuse of data, and improve trustworthiness through the production of verifiable and reproducible research results. Increasing the openness of research data is efficient, improving the public return on investment, and generating positive externalities.
  • Open Science initiatives are emerging globally, including in less economically developed countries. There are consequently opportunities for these countries to take advantage of technological developments to secure a greater share in scientific production. Without determined action, there is also a risk that the divide in scientific production will widen.
In September 2019, CODATA and its Data Policy Committee convened in Beijing to discuss current data policy issues and developed a set of data policies adapted to the new Open Science paradigm. The Declaration proposed below is the result of that meeting and is now put forward for public review.
Read the Beijing Declaration on Research Data here
[1] In the attached document we deliberately use the word data very broadly, to comprise data (stricto sensu) and the ecosystem of digital things that relate to data, including metadata, software and algorithms, as well as physical samples and analogue artefacts (and the digital representations and metadata relating to these things).

Message from the CODATA President, Barend Mons

The field of research data and associated services is in a rapid, epoch-making phase transition from a data-sparse to a data-overloaded ecosystem. Many national and international efforts are underway to try to deal with the enormous challenges posed by instrumentation and automation and the associated explosion in the volume and complexity of data. We all try to keep pace with this phenomenon by deploying the analytical processes and tools needed to enable data-intensive science, supported by machines. In order that high-throughput data generation instruments and computers may effectively support the scientific and innovation process, both data and workflow components need to be machine-actionable. Building on and refining many earlier efforts, the FAIR principles were formulated in 2014. These principles recommend that data (and the services around them) should be Findable, Accessible, Interoperable and (thus) Reusable, first and foremost by machines.

In 21st century science, computers need to be fully enabled to do the hard work of processing, pattern identification and machine learning in relation to enormous amounts of heterogeneous, distributed data. Human researchers, and the science system as a whole, will benefit from machine-actionable data, as less time will be spent on data munging. When data are stewarded and processed properly, ambiguity and non-reproducibility will also be less of a problem. In addition, many datasets and resources are now either too large or too privacy-sensitive, or both, to be effectively routed around the globe for multidisciplinary and data-intensive science projects. Therefore, distributed machine learning enables a new paradigm that I refer to as ‘data visiting’, in contrast to the classical model of ‘data sharing’.

These rapid changes have in significant respects ‘taken science by surprise’, and many groups and infrastructures have great difficulty adapting to this revolutionary new way of doing science. Rather than ‘excellence in silos’, and scholarly communication designed mainly for person-to-person information and knowledge transfer, we now need ‘excellence across silos’. We need to conceive of the underpinning ecosystem as, in essence, one computer with one universal dataset. Workflows dealing with data, and the data themselves, are being reused over and over and need to be fully interoperable, reusable and reproducible. In particular, when we address the major challenges facing our planet, as laid out in the Sustainable Development Goals, the data needed to gain the necessary insights come from many different domains and are frequently not purposefully generated for research. For an ‘Internet of FAIR Data and Services’ to emerge and flourish, all digital resources should be intrinsically FAIR and processable outside the environments and systems in which they were created. In other words, they need to be universally reusable. The good news is that computers can translate FAIR digital resources from one format to another with high speed and minimal error rates, as long as the machine has enough information about the resource. Another way of expressing the objective of FAIR is that when a resource is FAIR, ‘machines know what it means’. In essence, the machine can answer three major questions for each FAIR digital object or resource it encounters:

  1. What is this?
  2. What operations can be performed on it?
  3. What operations are allowed?

With properly constructed FAIR digital resources, these questions can be answered, which enables machines (and thus also ultimately humans) to reuse them with full provenance outside their original context. Elusive as this may sound, I am very confident that the current international efforts in this exciting domain will soon yield the first scalable ecosystems that follow these principles, and major industries are already moving into this space as well. So be warned: the coming four years will not be ‘science as usual’!
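Purely as an illustrative sketch (not any actual FAIR implementation; the class, field and identifier below are all hypothetical), the three questions can be pictured as a digital object carrying machine-readable metadata that software interrogates before acting:

```python
# Toy model of a "FAIR digital object": metadata lets a machine answer
# (1) what is this? (2) what operations does it support? (3) which are allowed?
from dataclasses import dataclass, field


@dataclass
class FairDigitalObject:
    identifier: str                  # globally unique, resolvable ID (Findable)
    media_type: str                  # what the resource is (Interoperable)
    operations: list = field(default_factory=list)      # operations the type supports
    licence_allows: list = field(default_factory=list)  # operations the licence permits

    def what_is_this(self) -> str:
        return self.media_type

    def possible_operations(self) -> list:
        return self.operations

    def allowed_operations(self) -> list:
        # an operation is usable only if it is both supported and licensed
        return [op for op in self.operations if op in self.licence_allows]


dataset = FairDigitalObject(
    identifier="hdl:21.T11148/example",  # hypothetical handle-style identifier
    media_type="text/csv",
    operations=["read", "aggregate", "redistribute"],
    licence_allows=["read", "aggregate"],  # e.g. a licence forbidding redistribution
)
print(dataset.allowed_operations())  # ['read', 'aggregate']
```

The point of the sketch is only that provenance and permissions travel with the object itself, so a machine encountering it outside its original context can still decide what it may do.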

CODATA has been around for roughly 50 years: it has lived through the data-sparse times as well as the present data-rich era, which poses entirely different and daunting challenges, also for CODATA itself. CODATA, as a committee of the International Science Council (ISC), supporting the mission of the ISC as the global voice of science and its role in the UN system, has the responsibility to fill a specific and strategic niche in the global ecosystem of research-data-related activities. Many other organisations have complementary roles that are domain-specific, national or regional, or grass-roots and community-based. CODATA is actively engaging with these other international players in defining complementary and synergistic roles.

The data-intensive science and innovation challenge is obviously a global one: it should equitably involve all regions of the world, and it cannot be solved sustainably within disciplinary or national silos. That is the niche in which CODATA should operate. CODATA also has a key role to play in the involvement of regions of the world that have traditionally been data- and science-deprived. With the Internet of FAIR Data and Services emerging ‘as we click’, we should not widen the digital divide but leap-frog to close it, such that the new research ecosystem is also fair in the traditional sense. Open Science must also mean that no-one is left behind. The second bit of good news is that activities in the Global South are emerging at an early stage and some are ambitious enough to lead future developments.

As the CODATA President I work with the Executive Director, the officers and Executive Committee, and CODATA’s core staff to serve this multi-organisational ecosystem in service of the global science community. We also work with regional organisations such as the European Commission and the EU Member States on their major leading initiative, the European Open Science Cloud, which has an increasing number of partner initiatives in other regions. We build on the excellent work of our predecessors in CODATA, including the intellectual leadership of the past President Geoffrey Boulton, and in close collaboration with our parent organisation, the International Science Council.

As of 2017, and extending for the duration of my CODATA presidency, I also serve on the US National Academy of Sciences Board for Research Data and Information. With my election as president of CODATA, I will gradually hand over operational leadership in GO FAIR to others, and I will seek to play an ambassadorial role for both, to help drive a joint, converging and balanced ecosystem for international policies supporting open, data-driven science. We also work to consolidate and make explicit the key role of each of the internationally operating data organisations, and in particular to bring RDA, GO FAIR, WDS and CODATA even closer together, with clear and complementary mandates. When we lock arms at all levels, from institutional to international, I am optimistic that by the end of my term as President, the first phase of the Internet of FAIR Data and Services will be up and running.

For all this to happen, it will be of critical importance that each of the data supporting organisations is mandated and properly funded (although at the leanest necessary level) to serve the science and innovation communities, without competing for the same funds as the community they should serve. They should focus on those supra-level tasks that never make it to the top of the priority list of individual countries, regions, funders, researchers and innovators. In this set of partnerships, it is the CODATA mission to act strategically and globally to advance equitable Open Science, the FAIR ecosystem and to make data work for interdisciplinary global challenge research.

Research infrastructures have traditionally been almost an ‘afterthought’, or considered ‘other people’s problem’, which has resulted in a very dangerous situation: core resources that are massively used by researchers, such as curated databases and collections, mapping and standards services, are ‘operating on a shoestring’ and go through a near-death experience each time funded projects run out. We, as the research community, should collectively speak with one voice on these infrastructural and interoperability issues, as trusted representatives of the real needs of the research community itself and of society as a whole, towards policy makers, funders and unions dealing with the enormous data and analytics challenges we will face in the decades to come. It is an honour to be elected as the new president of CODATA and I hope to serve the community as expected.

Disaster Risk Reduction and Open Data Newsletter: November 2019 Edition

UN High Commissioner for Refugees: Climate Change and Displacement Climate change and natural disasters can add to and worsen the threats that force people to flee across international borders. The interplay between climate, conflict, poverty and persecution greatly increases the complexity of refugee emergencies.

Victoria, Australia - National Climate Change and Agriculture Plan Agreed Australian ministers met in Melbourne at the Agricultural Ministers’ Forum to endorse a Victorian-led program that will facilitate collaboration between state and Commonwealth governments to meet the challenges of climate change and support the agriculture sector to adapt.

Flood forecasting a cyclone game-changer for Fiji The ground-breaking project has developed and implemented a Multi-Hazard Early Warning System (MHEWS) that delivers an integrated approach to forecasting, monitoring and warning for coastal flooding, no matter what the cause - river or ocean.

Bangladesh to move Rohingya to flood-prone island Bangladesh will start relocating Rohingya Muslims to a flood-prone island off its coast as several thousand refugees have agreed to move.

Tasman fire review finds shortfalls in New Zealand's preparedness for large-scale blazes A review of firefighting efforts during the Tasman fires last summer, which cost Fire and Emergency New Zealand $13 million, has found shortfalls in the number of skilled staff working in risk management.

UNSDSN TReNDS - SDG Financing Initiative In 2018, SDSN launched and became the Co-Chair of a Working Group on SDG Costing & Financing with the IMF, OECD, and World Bank. This group convenes sector experts to aggregate their respective costing models and data for SDG targets, especially for low-income countries.

Addressing the Challenges of Drafting Contracts for Data Collaboration Contracts for Data Collaboration (C4DC) is a new initiative seeking to address barriers to data collaboration. The partnership, launched in early 2019, has already yielded a number of outputs, including a project inception brief, the Contractual Wheel of Data Collaboration tool — which presents key considerations for the development of data sharing agreements — and an initial analytical framework.

October 2019: Publications in the Data Science Journal

  Title: Different Preservation Levels: The Case of Scholarly Digital Editions
  Authors: Elias Oltmanns, Tim Hasler, Wolfgang Peters-Kottig, Heinz-Günter Kuper

  Title: A Method for Extending Ontologies with Application to the Materials Science Domain
  Authors: Huanyu Li, Rickard Armiento, Patrick Lambrix

  Title: Analysis of Several Years of DI Magnetometer Comparison Results by the Geomagnetic Network of China and IAGA
  Authors: Yufei He, Xudong Zhao, Dongmei Yang, Fuxi Yang, Na Deng, Xijing Li

Read more CODATA news