“Human progress depends not so much on the amount of knowledge
as on the speed of its circulation.”
Ernest Rutherford
“Knowledge is not a thing, but a relation.
It arises when we connect meanings.”
Ludwig Wittgenstein
In October, Tomsk became the venue for the forum “University Library: One Step Ahead,” which brought together more than four hundred experts from all over Russia. They discussed standards, data languages, and the culture of working with knowledge. In the new issue of his blog, TSU Rector Eduard Galazhinskiy explains why it is precisely the modernization of scientific and technical libraries that can set the pace for the country’s technological leadership, and how a humanistic framework — trust, openness, and responsibility — turns the knowledge infrastructure into the foundation of the future for science and education.
— Eduard Vladimirovich, science and education are currently undergoing a profound reconfiguration as key state institutions. Why does the country also need a new system of scientific and technical information at this difficult moment? Why not postpone it “until later”?
— The State System for Scientific and Technical Information (SSSTI) is the “circulatory” and at the same time the “nervous” system of the country’s “brain” — its science and education. It drives reform in both: it feeds the entire knowledge ecosystem with data, results, and innovative ideas, and ties all these elements into a single whole, ensuring sensitivity and feedback. Universities can update educational tracks, science can generate discoveries, and industry can set technological tasks. But if knowledge is not constantly circulating — not being “seen,” not finding its addressee, not turning into a prototype and then into a product — the system loses speed and memory. SSSTI is designed to develop and maintain the speed of circulation of knowledge — from the initial idea and laboratory result to its implementation in a specific sector. It is a coherent and well-ordered infrastructure: unified rules of description, unified identifiers, unified “windows” of access and analytics.
At the same time, it is also a matter of culture: the ability of a nation to handle its own knowledge with care — to preserve it, standardize it, connect heterogeneous artifacts with one another — reports, data, publications, patents, prototypes — and return them into circulation in science, education, and industry. This is why synchrony with the reform of higher education is no coincidence: we are adjusting not only “what and how we teach,” but also “how we treat knowledge” as a national asset.
In addition, we are now at a point in time where three lines have converged. The first is historical. After the old institutions of scientific and technical information (STI) collapsed, we needed time to rethink their role. Today this new round of “reassembly” is logical: without a new SSSTI, the country will not be able to quickly scale up even the results it has already achieved. The second line is technological. Mature digital solutions have emerged that make it possible to ensure end-to-end traceability of data on research and development (R&D) — from collecting and systematizing information to registers and sectoral analytics. The third is political and legal. Decisions have been made that launch a federal project to modernize scientific and technical libraries as the front end of the future SSSTI and define the contours of integration and staffing for this system. In other words, conditions have formed under which we no longer need to “stitch together” fragmented practices but can build a coherent system: we have the rationale, the platform, and the mandate to act.
For reference:
Presidential Order No. Pr-616 of March 30, 2024 launched the federal project “Development of Scientific and Technical Libraries” and defined two key contours — digital and human-resource. The project is directly linked to the future State System for Scientific and Technical Information (SSSTI): modernized scientific and technical libraries become its “support and hands” on the ground, taking on large-scale collection, digitization, and normalization of data. The co-executors of the project are Tomsk State University, the Russian State Public Scientific and Technical Library (GPNTB Russia), the Library for Natural Sciences of the Russian Academy of Sciences (LNS RAS) and the All-Russian Institute for Scientific and Technical Information of the Russian Academy of Sciences (VINITI RAS).
In parallel, starting in 2024, the “Science and Innovation” domain began operating on the GosTech platform, where a number of digital services have been deployed and certified. In the relatively near future, this domain will serve as the main integration platform for end-to-end traceability of data on R&D, relying on a target architecture: a register of scientific and technical libraries and collections, a “single profile” of metadata (GRNTI, DOI, ORCID, ROR), data catalogs and showcases, STI analytics services, as well as integration via API/ETL with the Unified State Information System for R&D and the Russian Science Citation Index. In this configuration, libraries will function as the front end for collecting and verifying information, after which the data will flow into the domain’s unified information system for subsequent publication and sectoral analytics.
http://www.kremlin.ru/acts/assignments/orders/73759/print
https://gisnauka.ru/
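The “single profile” of metadata mentioned above can be pictured as one shared record structure with built-in quality checks. A minimal sketch, assuming the identifier set named in the text (GRNTI, DOI, ORCID, ROR); the field names and validation rules are illustrative, not an official schema:

```python
from __future__ import annotations
from dataclasses import dataclass, field
import re

# Hypothetical sketch of a "single profile" record for an R&D result.
# Field names and validation patterns are assumptions for illustration,
# not the actual schema of the GosTech "Science and Innovation" domain.

DOI_RE = re.compile(r"^10\.\d{4,9}/\S+$")
ORCID_RE = re.compile(r"^\d{4}-\d{4}-\d{4}-\d{3}[\dX]$")

@dataclass
class RndRecord:
    title: str
    grnti: str                      # GRNTI rubric code, e.g. "53.49.05"
    doi: str | None = None
    author_orcids: list[str] = field(default_factory=list)
    org_ror: str | None = None      # ROR ID of the affiliated organization

    def problems(self) -> list[str]:
        """Return a list of metadata-quality problems (empty = record is clean)."""
        issues = []
        if self.doi and not DOI_RE.match(self.doi):
            issues.append(f"malformed DOI: {self.doi}")
        for orcid in self.author_orcids:
            if not ORCID_RE.match(orcid):
                issues.append(f"malformed ORCID: {orcid}")
        return issues

rec = RndRecord(
    title="High-entropy alloy coating study",
    grnti="53.49.05",
    doi="10.1234/example.2024.001",
    author_orcids=["0000-0002-1825-0097"],
)
print(rec.problems())  # → [] for a well-formed record
```

The point of such a structure is that every library fills the same fields the same way, so records from different institutions can be aggregated without manual reconciliation.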
— How was the Soviet SSSTI organized, and what role did it play in the development of Soviet science and technology?
— Formally, it was established in the mid-1960s, but its institutional core and the practice of “end-to-end” information support for complex programs emerged earlier, in the 1950s and early 1960s. By that time, specialized institutes of scientific and technical information were already operating, literature was being processed in abstracting services, standards of description and channels for delivering information to design bureaus and industries were being refined. The launch of the system in the mid-1960s formalized and scaled up what had taken shape as an effective mode of working with information. VINITI was placed at the center as the main methodological hub and aggregator, and a network of Centers for Scientific and Technical Information (CNTI) was built across the country as the “field hands” responsible for collecting and delivering information, while unified classifiers (later GRNTI) gradually became the common language. The point was to ensure reproducible speed: from the emergence of knowledge to its use in design and production. One cannot say that the Soviet SSSTI, for example, “created” the space and nuclear programs, but the experience of these grand projects clearly demonstrated why the country needs an institutionally formalized information system.
— What happened to this system?
— In the 1990s something occurred that is fatal for any system: its integrity was broken. Funding shrank, unified regulations fell apart, many CNTIs closed or changed their profile. By the mid-2000s, what remained were fragmented contours — electronic collections, individual databases, initial integrations with international identifiers, and state accounting of R&D — but without a common “spine” that would connect data collection, metadata standardization, catalogs, and sectoral analytics. At the same time, the key institutions — major STI centers and national libraries — survived. It is on this foundation that today’s “reassembly” is being carried out: instead of “patching” individual segments, we are restoring a connected system with unified rules, infrastructure, and personnel.
— What does the target architecture of the new SSSTI look like?
— A whole community of experts is currently working on the system’s architecture. However, it is absolutely clear that it must be based on a network of scientific and technical libraries, because this is where data are collected, digitized, verified, and enriched according to unified regulations, tied to common description languages — from GRNTI to international identifiers such as DOI, ORCID, and ROR. The second important component is services: a “single window” for researchers, managers, and industries, including catalogs, analytical dashboards, and application programming interfaces (APIs) for embedding data into sectoral circuits and state R&D accounting systems. In such a configuration, the library ceases to be just a repository and becomes a service center for knowledge, and data gain traceability throughout the entire life cycle — from a lab note to implementation.
— How will the “new” SSSTI differ from the “old” one?
— Essentially by a shift in logic. Previously, it was primarily a reference and abstracting model and departmental fragmentation: strong collections, but a lot of “manual searching” and few end-to-end links. The new model is service- and management-oriented: a single metadata profile instead of numerous local descriptions, common identifiers instead of “home-made” card files, catalogs and analytical dashboards instead of disconnected lists, a unified set of rules, protocols, and tools (APIs) instead of closed “corners” of information.
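The move from “numerous local descriptions” to a single metadata profile is, in practice, an ETL-style normalization step. A minimal sketch with two invented local card-file formats (the source field names are assumptions for illustration):

```python
# Hypothetical sketch: mapping two local card-file formats onto one shared
# metadata profile. The source field names ("zaglavie", "doc_title", etc.)
# are invented for illustration.

FIELD_MAPS = {
    "library_a": {"zaglavie": "title", "kod_grnti": "grnti", "doi": "doi"},
    "library_b": {"doc_title": "title", "rubric": "grnti", "doi_id": "doi"},
}

def to_single_profile(source: str, raw: dict) -> dict:
    """Rename source-specific fields to the shared profile's field names."""
    mapping = FIELD_MAPS[source]
    return {mapping[k]: v for k, v in raw.items() if k in mapping}

a = to_single_profile("library_a", {"zaglavie": "Report on R&D", "kod_grnti": "20.23.25"})
b = to_single_profile("library_b", {"doc_title": "R&D report", "rubric": "20.23.25"})
assert set(a) == set(b) == {"title", "grnti"}  # both now "speak" the same profile
```

Once every source is mapped this way, catalogs, dashboards, and APIs can be built over one structure instead of dozens of home-made ones.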
I would add one fundamentally important difference: a humanistic and legal framework is embedded in the technology. Access regimes, rules for citation and reuse, and responsibility for metadata quality will all be defined. In other words, a mere “collection of sources” is being replaced by a knowledge infrastructure designed for speed, traceability, and responsible use in education, science, and industry.
— How is the federal project “Development of Scientific and Technical Libraries of Russia,” for which Tomsk State University is one of the key executors, connected with the creation of the new SSSTI?
— It is important to emphasize that this is not a “parallel story” but the foundation of the new system. Let me explain simply. Any national information system does not live on a server but where knowledge is generated, recorded, and brought into order. Previously this “field work” was done by centers of scientific and technical information: they took on the routine tasks of collecting, describing, and delivering information to those who needed it. Today, scientific and technical libraries are assuming that role, but in a different logic: the library is a service center for knowledge. On its side are primary data collection and alignment with a single profile, assignment of identifiers, quality assurance, and basic services for researchers and engineers. Then the data follow along the integration contour into a unified information environment and become visible in the “single window” — in catalogs, analytical dashboards, and applied APIs for industries. Therefore, when we say “new SSSTI,” we must remember: without this “ground” — without a network of library-service centers — any central platform will resemble a perfectly built highway with no access roads leading to it.
And all these processes need to be accelerated right now, because, on the one hand, external restrictions on access to foreign databases and subscriptions are tightening. On the other hand, a technological window of opportunity is opening: tools for intelligent natural-language search, automatic classification, high-speed networks, and increased server capacity are already able to remove routine tasks from researchers and shorten the path from query to relevant knowledge.
In short, the conclusions are as follows: the system has its timeframe, roles, and resources; libraries have a new function as digital knowledge centers; and the professional community has an opportunity right now to influence standards so that by 2030 we will have a working scientific infrastructure capable of supporting the goal of technological leadership.
— You have just mentioned the growing restrictions on access by Russian researchers and engineers to foreign databases and subscriptions. How have the tasks of creating scientific and technical information systems been tackled in the West and, say, in China?
— If we look at the experience of other countries, it becomes clear that working with scientific and technical information has long been part of state strategy there. In the West and in China, it is not an auxiliary function but the basis of scientific leadership — an infrastructure without which discoveries do not turn into technologies.
In the United States, this path began after World War II. In the 1950s, the predecessor of the National Technical Information Service — NTIS, which received its current name in 1970 — appeared. It collected the results of all research funded by federal agencies and made them available to universities, companies, and government bodies. Over time, NTIS became a full-fledged data platform with millions of documents, analytics, and API access. It has long been understood there that without standardized metadata and open channels of exchange, knowledge gets stuck at the level of reports.
In Europe, the main bet has been on integration. For about fifteen years, the OpenAIRE network has been operating, bringing together more than one and a half thousand universities and institutes. Every research result here — an article, a grant, a prototype — receives a digital identifier. This makes it possible to trace the path of knowledge from the laboratory to implementation and avoid duplication. It saves resources and speeds up the introduction of new technologies.
China covered this distance very rapidly. In the early 2000s, there were dozens of unconnected departmental databases. Today there is a national infrastructure for open scientific communication, PubScholar, created under the auspices of the Chinese Academy of Sciences. It includes millions upon millions of digital objects — from articles and patents to research data. Specialized platforms have emerged within this ecosystem, for example AstroCloud — “cloud astronomy,” where observation, processing, and publication are tied into a single cycle. It was precisely this discipline in working with data that produced the acceleration effect. Projects like the quantum satellite “Micius” became possible not only thanks to engineering solutions, but also because the infrastructure allowed data, results, and teams to be combined almost instantly. The essence is the same across all these examples: technological leadership rests not only on laboratories and equipment, but also on the speed and quality of knowledge circulation. Whoever builds the infrastructure controls the future.
— So, are we to understand that “there,” systems of scientific and technical information were created quickly, well-designed from the outset, and therefore highly effective?
— Of course not. Everywhere, such systems emerged with difficulty. In the West, the path to a modern knowledge infrastructure was long and contradictory. In the United States, everything began with the classic conflict between the state, business, and publishers. After World War II, thousands of reports on military and civilian R&D appeared in the country, but there was no unified mechanism for distributing them. When the predecessor of NTIS was created in the 1950s, part of the scientific community met it warily: many feared excessive state control over information, while private publishers worried about losing profits. In the 1980s, the debate flared up again when the open data movement began: federal agencies demanded transparency, whereas publishers and universities defended their subscription-based models. This conflict between “openness” and “monetization” has still not been fully resolved.
The second challenge was technological. In the 1990s, when the internet went mainstream, it became clear that every agency and university had its own metadata formats and classifiers. NTIS and other federal services had to spend decades bringing these disparate systems into a unified profile — through DOI, ORCID, DataCite. In fact, it took two generations for scientific information to become at least somewhat compatible.
In Europe, the process of creating STI systems proceeded via consortia and slow alignment of interests. OpenAIRE, for example, was built over almost fifteen years, and at every stage it was necessary to negotiate: which data should be considered “research results,” who is responsible for metadata quality, how to balance openness with the protection of intellectual property. Personal data in humanities and medical projects turned out to be particularly sensitive. That is why the European model is slower but more resilient: every decision there is born at the intersection of science, law, and ethics.
China, as always, went its own way — rapidly, but through a series of serious contradictions. In the early 2000s, when the country bet on technological leadership, it turned out that it lacked a unified map of knowledge: each academy and each ministry had its own databases and access rules. Information did not circulate between science and industry, and digital solutions developed as isolated “islands.” The first attempts at integration sparked debate: the state obtained a colossal array of publications, but access proved expensive, and control overly centralized.
The second challenge was data quality. China was growing faster than standards could be developed. Even the key metadata fields — authorship, language, affiliation — could not be reliably compared across systems for a long time. Only by the 2020s was an identifier system approved that is comparable to international standards such as DOI and ORCID. The third problem was cultural. For a long time, open data exchange was seen as a risk — loss of priority, the possibility of misuse. It was necessary to form a new culture of working with knowledge, where transparency is not a threat but a resource for acceleration.
And finally, there is the issue of trust. Any centralized system requires feedback. In creating PubScholar and sectoral platforms such as AstroCloud, China sought a balance between architectural unity and freedom for research initiatives. Today this balance is being built through regional hubs and a “federated infrastructure” model. Here it is also important for us to understand that infrastructure must not grow faster than the culture that uses it.
There is also a common problem for everyone — personnel. STI systems require new professions: metadata engineers, data curators, specialists in scientific communication. In neither the US, nor Europe, nor China did these professionals appear overnight; they were trained “on the go,” and that is precisely why the introduction of new standards often stalled. So behind the external orderliness of foreign models lies enormous work on overcoming fragmentation and building trust. In this sense, our path is no different: we are simply taking this step in the digital era, when the speed of mistakes and the speed of corrections have become much higher.
— Let us return to the goals set for TSU and its partners under the federal project “Development of Scientific and Technical Libraries.”
— These goals are pragmatic. First, a unified information system capable of collecting and linking heterogeneous R&D results. Second, a digital register of scientific and technical libraries: who is located where, with which collections and services. Third, a target model of the library itself as a knowledge service center embedded in scientific-educational and industrial ecosystems. In the end, we must obtain not only regulations and “architecture diagrams” but also quite tangible things: vetted profiles of metadata and identifiers, standard processes for collecting and processing information, quality requirements, applied scenarios for industries, and, most importantly, trained people. All this directly “feeds” the formation of SSSTI: what is not collected and standardized “on the ground” will never appear in the “single window” — neither for the researcher nor for the customer from industry.
— What exactly is TSU responsible for in the project, and what, if we may put it this way, is the specificity of its role?
— TSU is primarily responsible for “linking meaning and technology.” First, for the scientific and methodological core: we are designing a flexible model of a scientific and technical library for different contexts — a university, a sectoral research institute, a provider of continuing professional education. It is not a single “ideal” scheme but a set of validated configurations: which collections and processes are needed, how to organize bringing data into a single metadata profile, where to introduce identifiers (GRNTI, DOI, ORCID, ROR), how to set up quality assurance and data routes to the integration contour. Second, as I mentioned above, we are responsible for the human-resource contour and are launching continuing education programs for library staff. Third, TSU is responsible for regulations and “interfaces”: we participate in fine-tuning rules — from requirements for describing R&D results and their identification to access scenarios, service catalogs, and APIs for science, education, and industry. And finally, we are responsible for testing hypotheses in the field: in October 2025, a large-scale study of the scientific and technical library network was conducted on the basis of TSU’s Scientific Library, and its results were presented at the All-Russian forum “University Library: One Step Ahead.” They will help to form the empirical framework on which the target model will be built.
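The “quality assurance and data routes to the integration contour” mentioned here amount, in the simplest reading, to a gate: records with complete required fields move on, the rest go back for curation. A minimal sketch under that assumption (the required-field list and sample records are invented for illustration):

```python
# Hypothetical sketch of a quality gate on the route from a library to the
# integration contour. The required-field list and the sample records are
# assumptions for illustration, not the project's actual regulations.

REQUIRED = ("title", "grnti")

def route(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split records into (forward_to_integration, return_for_curation)."""
    forward, back = [], []
    for rec in records:
        (forward if all(rec.get(f) for f in REQUIRED) else back).append(rec)
    return forward, back

ok, needs_work = route([
    {"title": "Pilot plant report", "grnti": "61.13.21"},
    {"title": "Untagged preprint"},  # missing GRNTI rubric → returned
])
```

However the real regulations end up formulated, the principle is the same: incomplete metadata is caught at the library, not discovered downstream in the “single window.”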
Mikhail Shepel, Vice-Rector for Continuing Education at TSU; Chair of the TSU Library Council, one of the initiators of the federal project “Development of Scientific and Technical Libraries”:
— This year, Tomsk State University is developing a flexible model of a scientific and technical library and a corresponding standard, to be approved by the Technical Committee for Standardization No. 191, “Librarianship.” To this end, with the support of the Ministry of Science and Higher Education, our university is conducting an All-Russian study of the current state of scientific and technical libraries, in which more than 650 organizations have taken part. In addition, this year TSU will formulate the main principles of competitive selection for the modernization of scientific and technical libraries in 2026–2029: conditions for participation, key expected results, and the competition procedure itself.
If we talk about a certain specificity or even uniqueness of our role, it lies in combining the traditions of a classical university with the current national methodological agenda. We operate a laboratory and a library at the same time, an educational trajectory and a technological track; that is, we know how to translate the “language of data” into the “language of teaching and implementation.”
Tomsk has historically been strong in ecosystem thinking: we have consortia, engineering schools, and a practice of joint projects with industry, and now this experience is being transferred to STI. Essentially, we are creating a model of a scientific and technical library as a knowledge service center: with a clear architecture of processes, with live training modules for each role, with measurable requirements for metadata quality, and with supply chains that deliver knowledge to the user — researcher, teacher, engineer, or sectoral customer. You could say this is the university’s “added value”: we not only describe “how it should be,” we train the people who will do it and create the sites where it can be quickly tested and scaled.
— By the way, if we move on to the human level — to the end user: what exactly will a researcher, a lab head, and an industrial partner gain?
— For the researcher, first and foremost, it is a “single window” with proper descriptions and identifiers: you find an article and immediately see the linked datasets, project reports, patent applications, prototypes, colleagues, and organizations via ORCID/ROR. The endless “manual search” through different corners disappears; time is spent not on double-checking but on work. Plus, routine tasks are automated: correct metadata are pulled into reporting, grant forms, and institutional profiles; references and citations do not need to be tracked manually, they arrive together with the record.
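The linkage described here — find an article, immediately see datasets, reports, and patents — is essentially a walk over a graph of identifier links. A minimal sketch, where the link table and identifiers are invented for illustration:

```python
# Hypothetical sketch of the "single window" linkage: starting from one
# article, follow identifier links to related datasets, reports, and
# patents. The link table and identifiers are invented for illustration.

LINKS = {
    "doi:10.1234/a1": ["dataset:ds-77", "report:r-12"],
    "dataset:ds-77": ["patent:p-5"],
}

def related(root: str) -> set[str]:
    """Collect everything reachable from one record by following links."""
    seen, frontier = set(), [root]
    while frontier:
        node = frontier.pop()
        for nxt in LINKS.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

print(sorted(related("doi:10.1234/a1")))
# → ['dataset:ds-77', 'patent:p-5', 'report:r-12']
```

This is exactly what replaces “manual search through different corners”: one query, and the whole chain of connected results surfaces at once.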
The head of a laboratory gets a dashboard for real, not paper, dynamics: what has already been done, what is in progress, where there are duplicates and bottlenecks, which results are ready to be transferred into teaching or into the technological circuit. All links become visible: from the scientific problem solved to a teaching module, to a contract with industry, to an intellectual property application. This reduces the risk of “internal losses” and makes it possible to manage not only publications but also the circulation of knowledge in a broader sense.
An industrial partner, finally, receives a transparent “radar” of technologies: catalogs of results, sectoral analytical dashboards, clear points of entry into a university or research institute, and APIs through which data can be embedded into their own processes. Plus an important guarantee of order: access regimes and rules for reuse are defined in advance, and there is no need to reinvent a legal scheme for each deal.
— How will all this affect the “speed” of the scientific cycle?
— Here “speed” is not a metaphor but a controllable parameter. It increases simultaneously in three loops. The first is search and analytics: a single metadata profile and identifiers reduce the time from a question to a corpus of relevant materials. The second is reproducibility and refinement: the interconnectedness of publications, data, and prototypes shortens the lag between “read” and “checked/improved.” The third is transfer: traceability from grant to pilot implementation removes brakes at the interface between science and industry, resulting in less duplication and fewer “losses in transmission.” From this follow new KPIs: time for search/verification, the share of results with correct metadata and identifiers, the time from project completion to appearance in a catalog and especially to the signing of a sectoral agreement. When a country gains the ability not only to measure these metrics but also to deliberately reduce them, the “speed of knowledge circulation” becomes a managed quantity, and this is, in fact, the key to technological leadership.
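One of the KPIs named above — time from project completion to appearance in a catalog — becomes a managed quantity the moment it is computed from event dates. A minimal sketch with invented sample records:

```python
from datetime import date
from statistics import median

# Hypothetical sketch of one KPI named in the text: the lag from project
# completion to appearance in a catalog. The sample records are invented
# for illustration.

projects = [
    {"completed": date(2025, 3, 1),  "in_catalog": date(2025, 3, 15)},
    {"completed": date(2025, 4, 10), "in_catalog": date(2025, 5, 2)},
    {"completed": date(2025, 6, 5),  "in_catalog": date(2025, 6, 11)},
]

lags_days = [(p["in_catalog"] - p["completed"]).days for p in projects]
print(f"median lag: {median(lags_days)} days")  # → median lag: 14 days
```

A number like this, tracked over time, is what turns “speed of knowledge circulation” from a metaphor into a controllable parameter.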
— If we talk about the country’s technological leadership, what fundamentally new things will the system provide in addition to “speed” and a “single window”?
— Without going into unnecessary technical detail, I would highlight several things that usually remain “behind the scenes,” but it is precisely they that make a technological leap reproducible. When a country acquires not just a “single window” but a live picture of where there are competency gaps and unresolved tasks, prioritization ceases to be intuitive. We see which studies will add the greatest effect to the overall development trajectory and adjust the plans of universities and industries accordingly. At the same time, the endless bureaucratic routine disappears: a researcher correctly describes a result once, and then the system itself “delivers” it into the necessary reporting and analytics circuits. This is not a trifle; it amounts to thousands of hours returned to actual science and engineering work.
There is another important layer: protection from going in circles. When publications, data, prototypes, and even failed attempts are linked into a single story, the system sends an early signal: “this has already been tried, and here is why it did not work.” We save resources at the level of programs and agencies and move faster with what is truly ready for the next step — from a lab prototype to testing and pilots. After that, regulatory mechanics kick in: when the effect of new solutions is transparently visible, it is easier to launch “sandboxes” and more quickly convert temporary rules into stable standards.
A separate topic is people. New roles do not emerge from presentations; they have to be grown “in the flow.” When training modules are embedded directly into the real processes of the system, students and young professionals enter the profession without a long “run-up,” and the quality of data and services grows here and now. What appears is what I call “sustainable memory,” which makes science and development less vulnerable to turbulence.
And finally, capitalization. When the origin of data is transparent, rights to reuse are spelled out, and routes for transferring results are standardized, our transaction costs decrease and fewer months are spent on coordination between universities and industry. Plus, an exportable product emerges — our knowledge-management practices, description standards, and educational solutions. They can be replicated across regions and sectors and, in some cases, offered in external markets. Taken together, this is the working logic of leadership: not one-off “miracles” but serial production of solutions based on visibility, reproducibility, and responsibility.
— If we boil the conversation down to a single thesis: what must the country learn thanks to the new SSSTI, and how will we be able to measure the result of this titanic work — the implementation of the federal project and the creation of SSSTI?
— If we reduce everything to one thesis, it would be this: we are betting not on some new information platform but on a new national discipline built on the assertion “we lose nothing, we do not do things twice, we see them through to application.” The outcome must be a new culture of working with technical information: in the laboratory, colleagues will correctly describe results; in the library, they will keep them “alive”; in management, they will rely on common rules; and in industry, they will return rapid feedback. The face of this reform is not a server room but people: a new-type librarian, a metadata engineer, a data architect, a teacher who introduces these practices from the first years of study. Once this culture is consolidated, speed and effectiveness will simply become side effects of normal work.
And we will measure progress by how much the path from result to use is shortened, how “losses in transmission” are reduced, and how the number of cases grows in which knowledge finds its addressee in time. We will do this openly — with a clear map of tasks and with pilots in which everyone can see their role. If we manage to maintain this human and professional framework, the country’s technological leadership will be a reward for all of us.
Eduard Galazhinskiy
Rector of Tomsk State University
Member of the Presidential Council for Science and Education
Vice President of the Russian Academy of Education
Vice President of the Russian Union of Rectors