Circulating (Poly)phenol Metabolites: Neuroprotection in a 3D Cell Model of Parkinson’s Disease

Abstract

The 140-amino-acid protein α-synuclein (αS) is an intrinsically disordered protein (IDP) with various roles and locations in healthy neurons that plays a key role in Parkinson’s disease (PD). Contact with biomembranes can induce α-helical conformations, but can also act as a seeding event for aggregation and a predominant β-sheet conformation. In PD patients, αS aggregates into various fibrillar structures, and this shift in aggregation and localization is associated with disease progression. Besides full-length αS, several related polypeptides are present in neurons, and the role of many of these αS-related proteins in the aggregation of αS itself is not fully understood. Two potential aggregation modifiers are the αS splicing variant αS Δexon3 (Δ3) and the paralog β-synuclein (βS). Here, polarized ATR-FTIR spectroscopy was used to study the membrane interaction of these proteins individually and in various combinations. The method allowed continuous monitoring of both the lipid structure of biomimetic membranes and the aggregation state of αS and related proteins. The use of polarized light also revealed the orientation of secondary structure elements. While αS destroyed the lipid membrane upon membrane-catalyzed aggregation, βS and Δ3 aggregated significantly less and did not harm the membrane. Moreover, the latter proteins reduced the membrane damage triggered by αS. The membrane interaction itself did not differ markedly between the synuclein variants. Taken together, these observations suggest that the formation of particular protein aggregates is the major driving force behind αS-driven membrane damage. An imbalance of αS, βS, and Δ3 might therefore play a crucial role in neurodegenerative disease.
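
The orientation information in polarized ATR-FTIR comes from the dichroic ratio of an absorption band measured with parallel- and perpendicular-polarized light. The following minimal Python sketch is not from the paper: the synthetic spectra, band positions, and integration limits are illustrative assumptions. It shows how such a ratio could be computed for the amide I band, whose position also distinguishes α-helical from β-sheet content:

```python
import numpy as np

# Hypothetical polarized ATR-FTIR absorbance spectra (illustrative only).
wavenumbers = np.linspace(1600, 1700, 201)          # cm^-1, amide I region

def gaussian_band(center, height, width=12.0):
    """Synthetic absorption band standing in for measured data."""
    return height * np.exp(-((wavenumbers - center) ** 2) / (2 * width ** 2))

# Parallel (p) and perpendicular (s) polarized spectra; a beta-sheet band
# near 1625 cm^-1 would indicate aggregation, an alpha-helix band sits near 1655.
A_p = gaussian_band(1655, 0.020) + gaussian_band(1625, 0.008)
A_s = gaussian_band(1655, 0.012) + gaussian_band(1625, 0.006)

def integrated_absorbance(A, lo, hi):
    """Integrate a band between two wavenumber limits (trapezoid rule)."""
    mask = (wavenumbers >= lo) & (wavenumbers <= hi)
    return np.trapz(A[mask], wavenumbers[mask])

# Dichroic ratio R = A_parallel / A_perpendicular for the helix band.
R_helix = (integrated_absorbance(A_p, 1645, 1665)
           / integrated_absorbance(A_s, 1645, 1665))
print(f"Dichroic ratio (amide I, helix band): {R_helix:.2f}")
# Converting R into a tilt angle requires the instrument-specific electric
# field amplitudes at the ATR crystal surface, which are omitted here.
```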

Non-canonical BIM-regulated energy metabolism determines drug-induced liver necrosis

Abstract

Paracetamol (acetaminophen, APAP) overdose severely damages mitochondria and triggers several apoptotic processes in hepatocytes, but the final outcome is fulminant necrotic cell death, resulting in acute liver failure and mortality. Here, we studied this switch of cell death modes and demonstrate a non-canonical role of the apoptosis-regulating BCL-2 homolog BIM/Bcl2l11 in promoting necrosis by regulating cellular bioenergetics. BIM deficiency enhanced total ATP production and shifted the bioenergetic profile towards glycolysis, resulting in persistent protection from APAP-induced liver injury. Modulation of glucose levels and deletion of mitofusins confirmed that severe APAP toxicity occurs only in cells dependent on oxidative phosphorylation. Glycolytic hepatocytes maintained elevated ATP levels and reduced ROS, which enabled lysosomal recycling of damaged mitochondria by mitophagy. The present study highlights how metabolism and bioenergetics affect drug-induced liver toxicity, and identifies BIM as an important regulator of glycolysis, mitochondrial respiration, and oxidative stress signaling.
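
The bioenergetic shift described above is typically quantified by relating oxygen consumption (OCR, reflecting oxidative phosphorylation) to extracellular acidification (ECAR, reflecting glycolysis). A minimal sketch of how such a profile could be classified; the readout values and the ratio threshold are invented for illustration, not taken from the study:

```python
# Hypothetical extracellular-flux readouts (OCR in pmol O2/min,
# ECAR in mpH/min); values are invented for illustration.
profiles = {
    "wild-type hepatocytes": {"ocr": 180.0, "ecar": 20.0},
    "BIM-deficient hepatocytes": {"ocr": 110.0, "ecar": 65.0},
}

def bioenergetic_label(ocr: float, ecar: float, threshold: float = 4.0) -> str:
    """Crude classification by OCR/ECAR ratio; the threshold is an assumption."""
    return "OXPHOS-dependent" if ocr / ecar > threshold else "glycolysis-shifted"

for name, p in profiles.items():
    label = bioenergetic_label(p["ocr"], p["ecar"])
    print(f"{name}: OCR/ECAR = {p['ocr'] / p['ecar']:.1f} -> {label}")
```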

Definition of the Neurotoxicity-Associated Metabolic Signature Triggered by Berberine and Other Respiratory Chain Inhibitors

Abstract

To characterize the hits from a phenotypic neurotoxicity screen, we obtained transcriptomics data for valinomycin, diethylstilbestrol, colchicine, rotenone, 1-methyl-4-phenylpyridinium (MPP+), carbaryl and berberine (Ber). For all compounds, the concentration triggering neurite degeneration correlated with the onset of gene expression changes. The mechanistically diverse toxicants caused similar patterns of gene regulation: the responses were dominated by cell de-differentiation and the triggering of canonical stress response pathways driven by ATF4 and NRF2. To obtain more detailed and specific information on the modes-of-action (MoA), the effects on energy metabolism (respiration and glycolysis) were measured. Ber, rotenone and MPP+ inhibited the mitochondrial respiratory chain and shared complex I as their target. This group of toxicants was further evaluated by metabolomics under experimental conditions that did not deplete ATP. Ber (204 changed metabolites) showed effects similar to those of MPP+ and rotenone. The overall metabolic situation was characterized by oxidative stress, an over-abundance of NADH (>1000% increase), and a re-routing of metabolism to dispose of the nitrogen resulting from increased amino acid turnover. This unique overall pattern led to the accumulation of metabolites known as biomarkers of neurodegeneration (saccharopine, aminoadipate, and branched-chain ketoacids). These findings suggest that the neurotoxicity of mitochondrial inhibitors may result from an ensemble of metabolic changes rather than from simple ATP depletion. The combi-omics approach used here provided richer and more specific MoA data than the more common transcriptomics analysis alone. As Ber, a human drug and food supplement, closely mimicked the MoA of known neurotoxicants, its potential hazard requires further investigation.
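
The stated correlation between the concentration triggering neurite degeneration and the onset of gene expression changes can be checked with a simple log-scale correlation across compounds. A sketch with made-up benchmark concentrations (the compound order matches the abstract, the numbers do not come from the study):

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical benchmark concentrations in µM (illustrative values only,
# one per compound: valinomycin, diethylstilbestrol, colchicine, rotenone,
# MPP+, carbaryl, berberine).
neurite_bmc = np.array([0.01, 5.0, 0.05, 0.1, 30.0, 80.0, 1.0])
transcriptome_bmc = np.array([0.02, 4.0, 0.04, 0.15, 25.0, 100.0, 1.5])

# Correlate on a log scale, since potencies span several orders of magnitude.
r, p = pearsonr(np.log10(neurite_bmc), np.log10(transcriptome_bmc))
print(f"log-log Pearson r = {r:.2f} (p = {p:.3g})")
```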

Protectiveness of NAM-based hazard assessment – which testing scope is required?

Abstract

Hazard assessment (HA) requires toxicity tests that allow deriving protective points of departure (PoDs) for risk assessment, irrespective of a compound’s mode of action (MoA). The scope of the in vitro test batteries (ivTB) required for this purpose in systemic toxicity is still unclear. We explored the protectiveness regarding systemic toxicity of an ivTB whose scope was guided by previous findings from rodent studies, where examining six main targets, including liver and kidney, was sufficient to predict the guideline-scope-based PoD with high probability. The ivTB comprises human in vitro models representing the liver, kidney, lung, and neuronal system, covering the transcriptome, mitochondrial dysfunction, and neuronal outgrowth. Additionally, 32 CALUX®- and 10 HepG2 BAC-GFP reporters cover a broad range of disturbance mechanisms. Eight compounds were chosen because they cause adverse effects in vivo, such as immunotoxicity or anemia, i.e., effects not directly covered by assays in the ivTB. PoDs derived from the ivTB and from oral repeated-dose studies in rodents were extrapolated to maximum unbound plasma concentrations for comparison. The ivTB-based PoDs were one to five orders of magnitude lower than the in vivo PoDs for six of the eight compounds, implying that they were protective. The extent of the in vitro response varied across test compounds. Especially for hematotoxic substances, the ivTB showed either no response or only cytotoxicity; assays that better capture this type of hazard would be needed to complement the ivTB. This study highlights the potentially broad applicability of ivTBs for deriving protective PoDs for compounds with unknown MoA.
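
The protectiveness comparison reduces to checking whether the ivTB-derived PoD, once both PoDs are expressed as maximum unbound plasma concentrations, lies below the in vivo PoD. A minimal sketch with invented compound names and concentrations (none of the study's actual data):

```python
import math

# Hypothetical PoDs as maximum unbound plasma concentrations (µM).
pods = {
    # compound: (ivTB-derived PoD, in vivo-derived PoD)
    "compound A": (0.02, 5.0),
    "compound B": (1.0, 0.8),   # example of a non-protective outcome
}

for name, (pod_ivtb, pod_invivo) in pods.items():
    margin = math.log10(pod_invivo / pod_ivtb)  # orders of magnitude
    verdict = "protective" if pod_ivtb <= pod_invivo else "NOT protective"
    print(f"{name}: ivTB PoD {margin:+.1f} log10 units below in vivo -> {verdict}")
```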

Acceptance criteria for new approach methods in toxicology and human health-relevant life science research – part I

Abstract

Every test procedure, scientific and non-scientific, has inherent uncertainties, even when performed according to a standard operating procedure (SOP). In addition, it is prone to errors, defects, and mistakes introduced by operators, laboratory equipment, or the materials used. Adherence to an SOP and comprehensive validation of the test method cannot guarantee that each test run produces data within the acceptable range of variability and with the precision and accuracy determined during method validation. We illustrate here (part I) why controlling the validity of each test run is an important element of experimental design. The definition and application of acceptance criteria (AC) for the validity of test runs are important for the setup and use of test methods, particularly for the use of new approach methods (NAM) in toxicity testing. AC can be used for decision rules on how to handle data, e.g., to accept the data for further use (AC fulfilled) or to reject the data (AC not fulfilled). Adherence to AC has important requirements and consequences that may seem surprising at first sight: (i) AC depend on a test method’s objectives, e.g., on the types/concentrations of chemicals tested, the regulatory context, and the desired throughput; (ii) AC are applied and documented at each test run, while validation of a method (including the definition of AC) is performed only once; (iii) if AC are altered, the set of data produced by a method can change. Missing AC are the blind spot of quality assurance: test results may be neither reliable nor comparable. The establishment and uses of AC will be further detailed in part II of this series.
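
In practice, AC are evaluated as explicit decision rules applied to each run's control data before any test-item results are used. A minimal sketch, with control readouts and thresholds chosen purely for illustration (real AC would be fixed during method validation, not ad hoc):

```python
from dataclasses import dataclass

@dataclass
class RunControls:
    positive_control: float   # e.g., % effect of the positive control
    vehicle_cv: float         # coefficient of variation of vehicle wells, %

def run_is_valid(run: RunControls,
                 pc_range=(60.0, 100.0),   # assumed acceptable PC window
                 max_vehicle_cv=15.0) -> bool:
    """Return True only if all acceptance criteria are fulfilled."""
    pc_ok = pc_range[0] <= run.positive_control <= pc_range[1]
    cv_ok = run.vehicle_cv <= max_vehicle_cv
    return pc_ok and cv_ok

run = RunControls(positive_control=72.0, vehicle_cv=9.5)
print("accept data for further use" if run_is_valid(run)
      else "reject data, repeat test run")
```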

Usage of model combination in computational toxicology

Abstract

New Approach Methodologies (NAMs) have ushered in a new era in the field of toxicology, aiming to replace animal testing. However, despite these advancements, they are not exempt from the inherent complexities associated with the studied endpoint. In this review, we have identified three major groups of complexities: mechanistic, chemical space, and methodological. Mechanistic complexity arises from interconnected biological processes within a network that are challenging to model in a single step. Chemical space complexity arises from significant dissimilarity between the compounds in the training and test series. The third group encompasses algorithmic and molecular descriptor limitations and typical class imbalance problems. To address these complexities, this work provides a guide to the use of combinations of predictive Quantitative Structure-Activity Relationship (QSAR) models, known as metamodels. Combining low-level models (LLMs) enables a more precise approach to the problem by focusing on different sub-mechanisms or sub-processes. For mechanistic complexity, multiple Molecular Initiating Events (MIEs) or levels of information are combined to form a mechanism-based metamodel. Regarding the complexity arising from chemical space, two types of approaches were reviewed for constructing a fragment-based chemical space metamodel: those with and without structure sharing. Metamodels with structure sharing use unsupervised strategies to identify data patterns and build low-level models for each cluster, which are then combined. For situations without structure sharing, owing to pharmaceutical industry intellectual property, prediction-sharing and federated learning approaches have been reviewed. Lastly, to tackle methodological complexity, various algorithms are combined to overcome their individual limitations, diverse descriptors are employed to enhance problem definition, and balanced dataset combinations are used to address class imbalance (methodological metamodels). Remarkably, metamodels consistently outperformed classical QSAR models across all cases, highlighting the importance of alternatives to classical QSAR models when faced with such complexities.
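
The core idea of a metamodel, combining the outputs of several low-level models into one final prediction, can be sketched with a standard stacking ensemble. The learners, descriptor matrix, and dataset below are placeholders, not those of any reviewed study:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Placeholder for a descriptor matrix (rows: compounds, columns: descriptors);
# the class imbalance mimics a typical toxicity dataset.
X, y = make_classification(n_samples=300, n_features=40,
                           weights=[0.8, 0.2], random_state=0)

# Low-level models (LLMs), e.g., one per sub-mechanism or descriptor block.
llms = [
    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ("svm", SVC(probability=True, random_state=0)),
    ("lr", LogisticRegression(max_iter=1000)),
]

# Metamodel: a learner trained on the LLMs' cross-validated predictions.
meta = StackingClassifier(estimators=llms,
                          final_estimator=LogisticRegression(max_iter=1000),
                          cv=5)

score = cross_val_score(meta, X, y, cv=5, scoring="balanced_accuracy").mean()
print(f"metamodel balanced accuracy: {score:.2f}")
```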