Acceptance criteria for new approach methods in toxicology and human health-relevant life science research – part I

Abstract

Every test procedure, scientific and non-scientific, has inherent uncertainties, even when performed according to a standard operating procedure (SOP). In addition, it is prone to errors, defects, and mistakes introduced by operators, laboratory equipment, or materials used. Adherence to an SOP and comprehensive validation of the test method cannot guarantee that each test run produces data within the acceptable range of variability and with the precision and accuracy determined during the method validation. We illustrate here (part I) why controlling the validity of each test run is an important element of experimental design. The definition and application of acceptance criteria (AC) for the validity of test runs are important for the setup and use of test methods, particularly for the use of new approach methods (NAM) in toxicity testing. AC can be used for decision rules on how to handle data, e.g., to accept the data for further use (AC fulfilled) or to reject the data (AC not fulfilled). Adherence to AC has important requirements and consequences that may seem surprising at first sight: (i) AC depend on a test method’s objectives, e.g., on the types/concentrations of chemicals tested, the regulatory context, the desired throughput; (ii) AC are applied and documented at each test run, while validation of a method (including the definition of AC) is only performed once; (iii) if AC are altered, then the set of data produced by a method can change. AC, if missing, are the blind spot of quality assurance: test results may not be reliable or comparable. The establishment and uses of AC will be further detailed in part II of this series.
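
A minimal sketch of such a run-level decision rule is given below; the specific criteria, thresholds, and control values are hypothetical illustrations for this editorial note, not the AC discussed in the paper.

# Hypothetical run-level acceptance criteria (AC): names, thresholds and
# control values are illustrative assumptions, not values from the paper.
from statistics import mean, stdev

def run_acceptance(neg_ctrl, pos_ctrl, cv_max=0.15, effect_min=0.5, z_min=0.4):
    """Return (accepted, reasons) for one test run, judged on its controls."""
    reasons = []

    # AC 1: negative-control variability within the allowed range
    cv_neg = stdev(neg_ctrl) / mean(neg_ctrl)
    if cv_neg > cv_max:
        reasons.append(f"negative-control CV {cv_neg:.2f} > {cv_max}")

    # AC 2: positive control shows at least the minimum expected effect
    effect = 1 - mean(pos_ctrl) / mean(neg_ctrl)
    if effect < effect_min:
        reasons.append(f"positive-control effect {effect:.2f} < {effect_min}")

    # AC 3: assay window (Z'-factor) separates the two controls well enough
    z_prime = 1 - 3 * (stdev(pos_ctrl) + stdev(neg_ctrl)) / abs(mean(pos_ctrl) - mean(neg_ctrl))
    if z_prime < z_min:
        reasons.append(f"Z'-factor {z_prime:.2f} < {z_min}")

    return len(reasons) == 0, reasons

# Decision rule: accept the run's data for further use, or reject the run.
accepted, reasons = run_acceptance(neg_ctrl=[100, 96, 104, 99], pos_ctrl=[22, 18, 25, 20])
print("accept data" if accepted else f"reject run: {reasons}")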

Usage of model combination in computational toxicology

Abstract

New Approach Methodologies (NAMs) have ushered in a new era in the field of toxicology, aiming to replace animal testing. However, despite these advancements, they are not exempt from the inherent complexities associated with the endpoint under study. In this review, we have identified three major groups of complexities: mechanistic, chemical space, and methodological. The mechanistic complexity arises from interconnected biological processes within a network that are challenging to model in a single step. In the second group, chemical space complexity arises from significant dissimilarity between compounds in the training and test series. The third group encompasses algorithmic and molecular descriptor limitations and typical class imbalance problems. To address these complexities, this work provides a guide to the usage of a combination of predictive Quantitative Structure-Activity Relationship (QSAR) models, known as metamodels. This combination of low-level models (LLMs) enables a more precise approach to the problem by focusing on different sub-mechanisms or sub-processes. For mechanistic complexity, multiple Molecular Initiating Events (MIEs) or levels of information are combined to form a mechanistic-based metamodel. Regarding the complexity arising from chemical space, two types of approaches were reviewed to construct a fragment-based chemical space metamodel: those with and without structure sharing. Metamodels with structure sharing utilize unsupervised strategies to identify data patterns and build low-level models for each cluster, which are then combined. For situations without structure sharing due to pharmaceutical industry intellectual property constraints, prediction sharing and federated learning approaches have been reviewed. Lastly, to tackle methodological complexity, various algorithms are combined to overcome their limitations, diverse descriptors are employed to enhance problem definition, and balanced dataset combinations are used to address class imbalance issues (methodological-based metamodels). Remarkably, metamodels consistently outperformed classical QSAR models across all cases, highlighting the importance of alternatives to classical QSAR models when faced with such complexities.
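
To make the metamodel idea concrete, the sketch below stacks one low-level model per assumed descriptor block (e.g., one per MIE) and lets a meta-learner combine their out-of-fold predictions; the data, feature blocks, and model choices are placeholders rather than the models discussed in the review.

# Hypothetical sketch: combining low-level QSAR models (one per assumed
# sub-mechanism / descriptor block) into a metamodel by stacking.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict, cross_val_score

rng = np.random.default_rng(0)
n = 300
blocks = {                      # one descriptor block per assumed MIE / sub-process
    "MIE_1": rng.normal(size=(n, 20)),
    "MIE_2": rng.normal(size=(n, 15)),
    "physchem": rng.normal(size=(n, 10)),
}
y = rng.integers(0, 2, size=n)  # placeholder toxicity labels

# Low-level models: one per block; out-of-fold probabilities are used so the
# meta-learner is not trained on leaked information.
meta_features = np.column_stack([
    cross_val_predict(RandomForestClassifier(random_state=0), X, y,
                      cv=5, method="predict_proba")[:, 1]
    for X in blocks.values()
])

# Meta-learner combines the low-level predictions into the final call.
metamodel = LogisticRegression()
print("metamodel CV accuracy:", cross_val_score(metamodel, meta_features, y, cv=5).mean())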

Analysis of health concerns not addressed by REACH for low tonnage chemicals and opportunities for new approach methodology

Abstract

In the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) regulation, the criterion for deciding the studies that must be performed is the annual tonnage of the chemical manufactured or imported into the EU. The annual tonnage may be considered a surrogate for levels of human exposure, but this does not take into account the physico-chemical properties and use patterns that determine exposure. Chemicals are classified using data from REACH under areas of health concern covering effects on the skin and eye; sensitisation; acute, repeated and prolonged systemic exposure; effects on genetic material; carcinogenicity; and reproduction and development. We analysed the mandated study lists under REACH for each annual tonnage band in terms of the information they provide on each of the areas of health concern. Using the European Chemicals Agency (ECHA) REACH Registration database of over 20,000 registered substances, we found that only 19% of registered substances have datasets on all areas of health concern. Information limited to acute exposure, sensitisation and genotoxicity was found for 62%. The analysis highlighted the shortfall of information mandated for substances in the lower tonnage bands. Deploying New Approach Methodologies (NAMs) at these lower tonnage bands to assess health concerns which are currently not covered by REACH, such as repeated and extended exposure and carcinogenicity, would provide additional information and would be a way for registrants and regulators to gain experience in the use of NAMs. There are currently projects in Europe aiming to develop NAM-based assessment frameworks, and these could find their first use in assessing low tonnage chemicals once confidence has been gained by their evaluation with data-rich chemicals.
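
The kind of coverage analysis described could be tabulated roughly as sketched below; the column names and toy records are assumptions made for illustration and do not reflect the actual ECHA data model or the reported figures.

# Hypothetical sketch of cross-tabulating which areas of health concern each
# registered substance has data for; not the ECHA data model or real records.
import pandas as pd

AREAS = ["skin_eye", "sensitisation", "acute", "repeated_prolonged",
         "genotoxicity", "carcinogenicity", "repro_dev"]

# One row per registered substance; True = at least one study covering that area.
registrations = pd.DataFrame([
    {"substance": "A", "skin_eye": True, "sensitisation": True, "acute": True,
     "repeated_prolonged": False, "genotoxicity": True,
     "carcinogenicity": False, "repro_dev": False},
    {"substance": "B", "skin_eye": True, "sensitisation": True, "acute": True,
     "repeated_prolonged": True, "genotoxicity": True,
     "carcinogenicity": True, "repro_dev": True},
])

full_coverage = registrations[AREAS].all(axis=1).mean()
print(f"substances with data on all areas of health concern: {full_coverage:.0%}")

# Share of substances whose information is limited to the base-set areas only.
base_set = ["acute", "sensitisation", "genotoxicity"]
other = [a for a in AREAS if a not in base_set]
limited = (registrations[base_set].all(axis=1) & ~registrations[other].any(axis=1)).mean()
print(f"substances limited to acute/sensitisation/genotoxicity data: {limited:.0%}")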

Identification of the bacterial metabolite aerugine as potential trigger of human dopaminergic neurodegeneration

Abstract

The causes of nigrostriatal cell death in idiopathic Parkinson’s disease are unknown, but exposure to toxic chemicals may play some role. We followed up here on suggestions that bacterial secondary metabolites might be selectively cytotoxic to dopaminergic neurons. Extracts from Streptomyces venezuelae were found to kill human dopaminergic neurons (LUHMES cells). Utilizing this model system as a bioassay, we identified a bacterial metabolite known as aerugine (C10H11NO2S; 2-[4-(hydroxymethyl)-4,5-dihydro-1,3-thiazol-2-yl]phenol) and confirmed this finding by chemical re-synthesis. This 2-hydroxyphenyl-thiazoline compound was previously shown to be a product of a widespread biosynthetic cluster also found in the human microbiome and in several pathogens. Aerugine triggered half-maximal dopaminergic neurotoxicity at 3-4 µM. It was less toxic for other neurons (10-20 µM) and non-toxic (at <100 µM) for common human cell lines. Neurotoxicity was completely prevented by several iron chelators, by distinct antioxidants and by a caspase inhibitor. In the Caenorhabditis elegans model organism, general survival was not affected by aerugine concentrations up to 100 µM. When transgenic worms, expressing green fluorescent protein only in their dopamine neurons, were exposed to aerugine, specific neurodegeneration was observed. The toxicant also exerted functional dopaminergic toxicity in nematodes as determined by the “basal slowing response” assay. Thus, our research has unveiled a bacterial metabolite with a remarkably selective toxicity toward human dopaminergic neurons in vitro and toward the dopaminergic nervous system of Caenorhabditis elegans in vivo. These findings suggest that microbe-derived environmental chemicals should be further investigated for their role in the pathogenesis of Parkinson’s disease.
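
For context, the reported half-maximal toxic concentration is the midpoint of a concentration-response curve; the sketch below shows, with made-up viability data, how such an EC50 value can be estimated by fitting a Hill curve (illustrative only, not the study's data or analysis pipeline).

# Hypothetical EC50 estimation from concentration-response data; the viability
# values below are invented for illustration, not data from the study.
import numpy as np
from scipy.optimize import curve_fit

def hill(c, ec50, h):
    """Fraction of viable cells at concentration c (top = 1, bottom = 0)."""
    return 1.0 / (1.0 + (c / ec50) ** h)

conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100])                   # µM
viability = np.array([1.0, 0.98, 0.9, 0.55, 0.15, 0.05, 0.02])   # fraction of control

(ec50, h), _ = curve_fit(hill, conc, viability, p0=[3.0, 1.0])
print(f"estimated EC50 = {ec50:.1f} µM (Hill slope {h:.1f})")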

Assessing network-based methods in the context of system toxicology

Abstract

Introduction: Network-based methods are promising approaches in systems toxicology because they can be used to predict the effects of drugs and chemicals on health, to elucidate the mode of action of compounds, and to identify biomarkers of toxicity. Over the years, the network biology community has developed a wide range of methods, and users are faced with the task of choosing the most appropriate method for their own application. Furthermore, the advantages and limitations of each method are difficult to determine without a proper standard and comparative evaluation of their performance. This study aims to evaluate different network-based methods that can be used to gain biological insight into the mechanisms of drug toxicity, using valproic acid (VPA)-induced liver steatosis as a benchmark.

Methods: We provide a comprehensive analysis of the results produced by each method and highlight the fact that the experimental design (how the method is applied) is relevant in addition to the method specifications. We also contribute a systematic methodology to analyse the results of the methods individually and in a comparative manner.

Results: Our results show that the evaluated tools differ in their performance against the benchmark and in their ability to provide novel insights into the mechanism of adverse effects of the drug. We also suggest that aggregation of the results provided by different methods provides a more confident set of candidate genes and processes to further the knowledge of the drug’s mechanism of action.

Discussion: By providing a detailed and systematic analysis of the results of different network-based tools, we aim to assist users in making informed decisions about the most appropriate method for systems toxicology applications.
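
One simple way to aggregate candidate genes returned by several network-based methods, as suggested in the Results above, is to score genes by how many methods recover them and by their average rank; the method names and gene lists below are placeholders, not results from the study.

# Hypothetical aggregation of ranked candidate-gene lists from different
# network-based methods; names and rankings are placeholders.
from collections import defaultdict

method_rankings = {                      # ranked candidate genes per method (best first)
    "diffusion":   ["PPARA", "SREBF1", "CPT1A", "APOB"],
    "propagation": ["SREBF1", "PPARA", "FASN", "MTTP"],
    "subnetwork":  ["PPARA", "MTTP", "SREBF1", "SCD"],
}

support = defaultdict(int)     # how many methods recover each gene
rank_sum = defaultdict(float)  # sum of ranks across methods
for ranking in method_rankings.values():
    for rank, gene in enumerate(ranking, start=1):
        support[gene] += 1
        rank_sum[gene] += rank

# Keep genes recovered by at least two methods, ordered by mean rank.
consensus = sorted(
    (g for g in support if support[g] >= 2),
    key=lambda g: rank_sum[g] / support[g],
)
print("consensus candidates:", consensus)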

Integrating Mechanistic and Toxicokinetic Information in Predictive Models of Cholestasis

Abstract

Drug development involves the thorough assessment of the candidate’s safety and efficacy. In silico toxicology (IST) methods can contribute to this assessment, complementing in vitro and in vivo experimental methods, since they have many advantages in terms of cost and time and are less demanding in their requirements for test material and experimental animals. One of these methods, Quantitative Structure-Activity Relationships (QSAR), has proven successful in predicting simple toxicity endpoints but has more difficulty predicting endpoints involving more complex phenomena. We hypothesize that better predictions of these endpoints can be produced by combining multiple QSAR models describing simpler biological phenomena and incorporating pharmacokinetic (PK) information, using quantitative in vitro to in vivo extrapolation (QIVIVE) models. In this study, we applied our methodology to the prediction of cholestasis and compared it with direct QSAR models. Our results show a clear increase in sensitivity. The predictive quality of the models was further assessed under conditions mimicking realistic use, in which the query compounds show low similarity to the training series. Again, our methodology shows clear advantages over direct QSAR models in these situations. We conclude that the proposed methodology could improve existing methodologies and could be applied to other toxicity endpoints.
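
The general idea of combining mechanism-level QSAR outputs with QIVIVE-derived exposure information can be sketched as follows; the model names, weighting formula, and decision threshold are illustrative assumptions, not the models or parameters used in the paper.

# Hypothetical sketch: mechanism-level QSAR probabilities are weighted by a
# QIVIVE/PK-derived exposure margin before the final cholestasis call.
def cholestasis_score(qsar_probs, ic50_um, cmax_um):
    """Combine mechanism-level QSAR probabilities with an exposure margin."""
    # Exposure weight: the closer the expected plasma Cmax (from a QIVIVE/PK
    # model) gets to the in vitro potency, the higher the weight.
    margin = ic50_um / cmax_um               # large margin -> low in vivo relevance
    exposure_weight = 1.0 / (1.0 + margin / 10.0)

    # Simple combination of the low-level QSAR models (e.g. transporter
    # inhibition, mitochondrial toxicity), here just an average.
    mechanistic_risk = sum(qsar_probs.values()) / len(qsar_probs)
    return mechanistic_risk * exposure_weight

score = cholestasis_score(
    qsar_probs={"BSEP_inhibition": 0.8, "mitochondrial_tox": 0.6},
    ic50_um=12.0,     # predicted in vitro potency (hypothetical)
    cmax_um=4.0,      # plasma Cmax estimated by the PK/QIVIVE model (hypothetical)
)
print("cholestasis risk score:", round(score, 2),
      "-> positive" if score > 0.3 else "-> negative")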