By: Thomas E. Levy, MD, JD | Orthomolecular Medicine News Service | Posted on August 29, 2023 | GreenMedInfo.com

Most clinicians are familiar with the concept that when a little is good, more is often better, but a lot is still reliably toxic. This fosters the mindset that there is little chance of doing harm by supplementing a “little,” especially when the supplements involved are well-known, relatively popular, and widely regarded as beneficial without question. In the case of calcium, iron, and copper, the downside of even minimal supplementation could not be more clear-cut. All three of these agents are essential for health, especially inside the cells. Nevertheless, once a relatively low daily intake of these nutrients is exceeded even minimally, toxicity rapidly ensues, with the highest intakes resulting in the greatest toxicity. This is in great contrast with some other nutrients, such as vitamin C, niacin or niacinamide, or vitamin K2, for which toxicity is difficult to reach at any degree of intake or supplementation. Many other nutrient supplements, especially minerals, can be ingested to the point of toxicity, but the amounts needed are still much harder to reach than the minimally toxic intakes of calcium, iron, and copper.

Cellular Calcium Excess Underlies Disease

The marketing efforts of the dairy industry over the years have been wildly successful in convincing the public, as well as most doctors, that a high calcium intake from the diet (especially dairy), and by extension from supplementation, is of clear benefit to general health and to bone health. Unfortunately, the exact opposite is true, and excess calcium intake is the primary fuel sustaining and even provoking heart disease, cancer, and all chronic degenerative diseases.

Elevated intracellular calcium levels are present in all cells affected by disease processes, and very high levels are present in all malignant cells.

Furthermore, when therapeutic measures are taken to reduce these calcium levels, healthier cells always result. [1]

Several straightforward studies have revealed the tremendous toxicity of too much calcium. In a study of 61,433 Swedish women followed for a median of 19 years, those who ingested the most total calcium from both diet and supplementation had an all-cause mortality rate roughly 2.5 times that of women with lower calcium intakes. Similarly, the same high-intake group had more than double the mortality from coronary artery disease. [2] A meta-analysis of 15 trials also clearly demonstrated an increased risk of myocardial infarction in patients who took calcium supplements. [3]

The Coronary Artery Calcium (CAC) score has been used for over 30 years now to monitor the likelihood of a patient dying of coronary heart disease (myocardial infarction). A higher score indicates an increased chance of cardiac mortality. The CAC score is generated by a CT (computed tomography) scan over the heart. Greater amounts of calcium deposition in the coronary arteries consistently result in higher CAC scores. [4] Therapeutic measures that can increase or decrease this calcium accumulation correlate directly with an increased or decreased chance of cardiac mortality.
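For readers curious how such a score is actually computed, below is a minimal sketch assuming the conventional Agatston method (the function names and example values are illustrative only, not taken from this article): each calcified lesion detected on the CT is scored as its area multiplied by a weight based on its peak density, and the lesion scores are summed into the total CAC score.

```python
# Minimal sketch of Agatston-style CAC scoring (conventional method assumed;
# illustrative only, not a clinical implementation). Each calcified lesion is
# scored as its area (mm^2) times a weight based on its peak density in
# Hounsfield units (HU); the total CAC score is the sum over all lesions.

def density_weight(peak_hu):
    """Standard Agatston density weight for a lesion's peak HU value."""
    if peak_hu < 130:
        return 0  # below the usual calcium detection threshold
    if peak_hu < 200:
        return 1
    if peak_hu < 300:
        return 2
    if peak_hu < 400:
        return 3
    return 4

def agatston_score(lesions):
    """lesions: iterable of (area_mm2, peak_hu) pairs, one per calcified plaque."""
    return sum(area * density_weight(peak_hu) for area, peak_hu in lesions)

# Example: three hypothetical coronary plaques
print(agatston_score([(4.0, 180), (7.5, 320), (2.0, 450)]))  # 4*1 + 7.5*3 + 2*4 = 34.5
```

A total of zero means no calcified plaque was detected, while progressively higher totals reflect progressively heavier coronary calcification.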

Recent research now indicates that the CAC score is clearly predictive of all-cause mortality and not just death from coronary artery disease. [5] This indicates that the CAC score also serves as a reliable marker of the degree of calcium excess throughout the body, not just in the coronary arteries. A high amount of calcium deposition in the coronary arteries indicates calcium excess everywhere, even when that excess remains inside the cells and is not as readily detected as discrete calcium deposits. While some excess intracellular calcium can still be present when the CAC score is zero (normal), any positive score assures the presence of such excesses, with higher scores indicating greater excesses and greater degrees of pathology in the body.

Menopause, with its loss of estrogen production, contributes directly to increased intracellular calcium levels. [6] Normal estrogen levels are very effective in minimizing cytoplasmic calcium levels because estrogen serves as a calcium channel blocker, limiting calcium uptake into the cells. Consistent with this, menopause has now been shown to promote increased CAC scores. [7] Testosterone, the male sex hormone counterpart to estrogen, also serves as a calcium channel blocker throughout the body. [8] This relationship between declining sex hormone levels and increased intracellular calcium only further underscores the importance of giving some sex hormone support to older patients, even when hormone levels are still technically above the bottom of the laboratory reference range.

To be clear, the very well-defined relationship between increased intracellular calcium and the disease-causing increase in intracellular oxidative stress really means only one thing: never supplement calcium.

Iron and Copper: The Toxic Transition Metal Twins

Why call these two metals twins? Basically, it is because both are prominent promoters of the Fenton reaction inside the cells of the body. All cells contain them, but when these metals increase in concentration by even the most minimal degree, oxidative stress rapidly ramps up. And as increased oxidative stress (a state in which an excess of biomolecules are oxidized, or electron-depleted) is maintained or further increased, abnormal cell function (“disease”) flourishes.

The oxidation stimulated by ionic iron (Fe3+, Fe2+) and ionic copper (Cu2+, Cu1+) remains minimal (“physiological”) as long as no significant new intakes of these metals occur, especially when unwittingly supplemented. Most reasonably balanced diets will never supply too much of these metals, although this delicate balance is easily disrupted by the most minimal of supplemental intake.

The Fenton reaction plays a major role in the ability of the body to kill pathogens, pathogen-infected cells, cancer cells, and cells with massively increased intracellular oxidative stress that are on the border of necrosis and/or other forms of cell death like apoptosis. When not properly balanced, it also plays a major role in the chronic toxicity inflicted by supplemental iron and copper intake. Both metals are known as transition metals because they readily shuttle electrons through various metabolic pathways. This ease of electron passage is why iron and copper conduct electricity so well (current is electron flow).

The classic Fenton chemistry is seen in a pathogen-infected cell, especially when provoked by a sufficient administration of vitamin C. While vitamin C has many different immune-supporting and anti-pathogen properties, it is the promotion of Fenton chemistry inside the cell that likely accounts most directly for its infection-resolving properties.

The most virulent of pathogens are the most avid consumers of iron. It is this characteristic that effectively causes most pathogens to target themselves, since their iron excess so strongly fuels Fenton chemistry. Of note, some antibiotics owe much of their effectiveness to their ability to chelate iron, thereby weakening the pathogen as it loses access to new sources of iron for its growth.

The following steps describe a typical Fenton reaction-fueled destruction of a pathogen and its host cell (the core chemistry is summarized just after the list):

  • 50 or more grams of vitamin C is infused.
  • As the vitamin C floods the extracellular space, active and passive vitamin C transporters increase vitamin C levels inside the cells.
  • At the same time in the extracellular space, the vitamin C continually stimulates the formation of relatively large amounts of new hydrogen peroxide.
  • The hydrogen peroxide, which is already elevated inside the infected cell with its attendant focal hypoxia and acidosis, readily passes from the extracellular to the intracellular space.
  • The vitamin C donates an electron to Fe3+ (or Cu2+), with a reduction to Fe2+ or Cu1+.
  • The reduced metal then donates an electron to the hydrogen peroxide that is present, resulting in its prompt breakdown into a highly pro-oxidant entity known as hydroxyl radical.
  • This radical is so reactive that it cannot migrate but immediately oxidizes whatever it is adjacent to when it is formed.
  • Sustained formation of new hydroxyl radicals increases oxidative stress rapidly to the point of pathogen/cell rupture and death.
  • Hydrogen peroxide inside the cell helps mobilize Fe3+ from ferritin storage, ensuring adequate reactive iron to fuel the Fenton reaction to completion.
  • Therefore, continued vitamin C infusion assures that all components of the Fenton reaction remain sufficiently present until pathogen/cell death has been achieved. No substrates run out prematurely.
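In simplified form, and assuming the conventional description of ascorbate-driven Fenton chemistry (the equations below are an illustrative summary of the steps above, not additional findings), the core reactions are:

  Vitamin C (ascorbate) + Fe3+ → ascorbyl radical + Fe2+ (vitamin C donates an electron, reducing the metal)
  Fe2+ + H2O2 → Fe3+ + OH• (hydroxyl radical) + OH− (the Fenton reaction proper)
  Cu1+ + H2O2 → Cu2+ + OH• + OH− (the analogous copper-driven reaction)

The regenerated Fe3+ (or Cu2+) can then be reduced again by more vitamin C, allowing the cycle to repeat for as long as vitamin C, the metal, and hydrogen peroxide remain available.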

This interaction of vitamin C with copper (or iron), with the subsequent upregulation of the highly pro-oxidant Fenton reaction, was nicely demonstrated in a mouse study, which clearly showed that the simultaneous administration of vitamin C and copper led directly to increased systemic oxidative stress and kidney cell injury. [9] However, without added copper present, vitamin C alone readily protects against increased oxidative stress in the kidneys. [10]

It is well-established that both iron and copper play important roles as cofactors in various metabolic pathways and enzyme reactions. However, the total pool of reactive iron and copper in the body that plays these roles is extremely small, and it is almost completely maintained by an ongoing recycling of these metals inside the cells. Very little of these metals gets excreted, and so very little new intake is needed for them to perform these various metabolic functions. Nevertheless, these two metals have a powerful negative synergy in causing pathology. A good example of this is the elevated levels of both copper and iron that are found in human atherosclerotic plaques. [11]

Relatively massive amounts of iron, and to a much lesser extent copper, are needed to maintain the normal synthesis of new red blood cells, compared to the very small amounts required for their cofactor functions. Copper deficiency anemia is quite rare, while iron deficiency anemia is much more common. [12,13] However, iron deficiency anemia rarely occurs without a significant loss of blood, as can be seen with heavy menses or a bleeding gastrointestinal tumor. The bottom line for patient management, however, is that when the hemoglobin and hematocrit are normal, NO iron or copper should ever be supplemented. The ferritin level can never be considered too low, or a reason for iron supplementation, if the hemoglobin level is normal. Any such supplementation needlessly and reliably fuels excess oxidative stress throughout the body.

Too much supplementation guidance comes from researchers who have found that agent X has some effect on enzyme or protein Y, without any regard for the general health of the research subject or the stability of serial blood examinations over time. Taking a “deep dive” into understanding as much as possible about a supplement is fine, but the “macro” study should always be given much greater regard than the “micro” study, especially when such “micro” studies take place in animals or test tubes, and extrapolations are then made as to what supplement is good for the entire human body.

A good example comes from studies looking at the interactions between vitamin C and copper. In rats, increased dietary vitamin C has been shown to increase blood levels of vitamin C while reducing the plasma and tissue concentrations of copper. [14] In another study, both men and guinea pigs were supplemented with vitamin C. The supplementation increased ceruloplasmin (copper-carrying protein) levels in the men while decreasing them in the guinea pigs. The authors concluded that vitamin C has an antagonistic effect on copper metabolism in guinea pigs but not in humans. [15] A cell study concluded only that vitamin C exerts both positive and negative regulatory functions in copper metabolism, while stating that the mechanism is unclear. [16]

When trying to make sense of the advisability of copper supplementation after reviewing the studies above, consider that the ability of vitamin C, at least in rats, to reduce plasma and tissue concentrations of copper can reasonably be regarded as a good outcome. Given the ease with which added copper worsens oxidative stress, a chronic copper-lowering effect of vitamin C should not simply be assumed to be undesirable. Linus Pauling started taking 3 grams a day of vitamin C as soon as he learned of it in the 1960s, gradually increasing the amount until he was taking 18 grams a day in the last years of his life. Dr. Pauling died at age 93, and he was clear enough of mind to be giving lectures and speeches up to the last few months of his life. If Dr. Pauling was suffering from a vitamin C-induced copper deficiency, there was no clinical evidence that harm was being done. Quite the contrary: the human liver’s lost ability to make multiple grams of vitamin C daily and release it directly into the bloodstream argues that any copper-lowering effect of vitamin C is completely desirable, and that most of the human population is already dealing with some degree of copper toxicity that is no longer being alleviated by the missing endogenous production of vitamin C in the genetically defective human liver. [17]

The consistent relationship between elevated copper levels and carcinogenesis should cause grave concern for health seekers who regularly supplement copper. Many studies have consistently shown that the individuals with the highest blood levels of copper contract, and sustain, the most cancers. Just as cancer cells “feed” on iron, they are also fed by copper. It appears that the continued presence of more copper is a major factor both in causing the initial malignant transformation and in fueling the cancer’s aggressive growth and spread. Tumor copper levels and blood copper levels are elevated in a wide variety of cancers, including breast, cervical, ovarian, lung, gastric, bladder, thyroid, oral, pancreatic, and head and neck cancers. [18-31] Furthermore, higher serum levels of copper are seen in more advanced stages of cancer and correlate directly with how readily the malignancy grows. [32] In some hematological malignancies, periods of cancer remission are seen as serum copper levels become lower. [33] And just as less copper can induce cancer remission, more copper can be given as oxidation-inducing “chemotherapy” to push the already increased intracellular oxidative stress (Fenton reaction) even higher, with cancer cell death eventually resulting. [34,35]

Always look for the longevity (all-cause mortality) studies to get the clearest picture of the ultimate impact of an agent on the human body. The impact that an agent has on isolated metabolic functions in the cytoplasm (the “accumulation of minutia,” as Dr. Robert Cathcart once noted) is often completely irrelevant when it is used to argue against taking a supplement that has already been proven to decrease all-cause mortality. And this is even more the case when the data come from an animal or in vitro study. As Dr. Abram Hoffer once noted, studies can make good scientific points while still remaining “clinically unimportant.”

Furthermore, such “micro” data should never be used, intentionally or unwittingly, to strike fear into the hearts of potential supplementers of a clearly beneficial agent, such as vitamin C. It has long been established in “macro” studies that individuals who maintain the highest blood levels of vitamin C live the longest. [36,37] In the face of such data, if vitamin C truly does lower copper levels in the body, then that effect appears highly desirable: the only long-term consequence of chronic high doses of vitamin C would be to keep copper levels less elevated, never to cause widespread deficiencies. Such data also support the concept that truly copper-deficient individuals, except under the most extraordinary of circumstances, do not exist. The clear-cut conclusion is simply this: never supplement copper.

Iron, a transition metal like copper, reliably increases oxidative stress wherever it is found in its free, unbound state. Much, perhaps most, of this increased oxidative stress from iron results from an upregulation of the Fenton reaction throughout the body. While body-wide copper excess is easily inferred in nearly everyone from the data discussed above, it is not as easy to clearly establish the presence of such an excess through blood testing. On the other hand, excess iron in the body is reliably reflected in the ferritin blood test. [38,39] Ferritin is an intracellular protein that stores iron, releasing it from storage as needed by the body. Higher ferritin levels indicate high iron content in the body, although certain conditions, often inflammation secondary to acute infections, will increase the numbers on this test while still not reflecting increased amounts of iron in the body. This occurs since ferritin is an acute phase reactant, representing a leakage product from damaged and dying cells. [40] Severe infections can result in astronomical elevations in the ferritin level, as in advanced COVID patients. [41]

While perhaps counterintuitive, serum iron levels have little to no correlation with the amount of excess iron stored in the body. However, as more unbound (“free”) iron enters the body beyond the amount needed to sustain normal metabolic needs, the synthesis of ferritin is stimulated, and the excess iron is promptly stored inside the shell-like structure of the ferritin molecule. Not surprisingly, since free iron is a common culprit in stimulating excess oxidative stress, the presence of extra oxidative stress itself results in the synthesis of more ferritin, allowing the excess pro-oxidant free iron to be removed from the involved tissues. [42] Higher ferritin levels not only indicate increased stores of iron in the body, they also indicate an ongoing attempt by the body to synthesize enough ferritin to keep extracellular and intracellular levels of free iron at nontoxic levels. When free iron is very minimal, ferritin levels can drop very low, since ferritin is primarily needed as a buffer against excess iron.

Iron excess is so pervasive around the world (except in extremely malnourished third world countries) that the LabCorp laboratory reference range for ferritin is 30 to 400 ng/mL. The reference range for any laboratory test rests on the assumption that most individuals being tested are normal, so that the range brackets the majority of normal individuals in a given population. However, when a condition or deficiency affects nearly everyone being tested, the reference range has no direct bearing on normalcy at all. As will be shown, the LabCorp ferritin reference range actually encompasses NO normal levels, as any ferritin measurement above 25 ng/mL marks the beginning of excess iron accumulation. The true “normal” range for ferritin, which is seen mostly in children and younger menstruating females, runs roughly from 15 to 25 ng/mL, although lower levels can still be normal when no anemia is present.
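To make the point concrete, here is a minimal sketch (the numbers are hypothetical and only assume the common convention that a reference interval is taken as the central 95% of results from the tested population): if nearly everyone tested already carries excess iron, the derived “reference range” simply inherits that excess.

```python
# Minimal sketch: a reference interval derived as the central 95% of the
# tested population (a common convention, assumed here). The ferritin values
# below are hypothetical, for illustration only.
import random

random.seed(1)

# A truly normal population vs. one in which most people have accumulated
# extra iron over a lifetime (ferritin in ng/mL).
truly_normal = [random.uniform(15, 25) for _ in range(10_000)]
iron_loaded  = [random.uniform(25, 400) for _ in range(10_000)]

def central_95(values):
    """Return the approximate 2.5th and 97.5th percentiles of the values."""
    ordered = sorted(values)
    lo = ordered[int(0.025 * len(ordered))]
    hi = ordered[int(0.975 * len(ordered))]
    return round(lo), round(hi)

print(central_95(truly_normal))  # roughly (15, 25)
print(central_95(iron_loaded))   # roughly (34, 391) -- a "normal range" built on excess
```

The statistics are doing exactly what they are designed to do; the problem is that the population being summarized is not a healthy baseline.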

A ferritin level of 50 ng/mL is considered normal by many physicians, with some of them even regarding such a level as too low. Phlebotomy (blood donation) has been clearly established to reduce iron stores in the body and lessen laboratory parameters of lipid peroxidation and oxidative stress, decreasing the incidence of heart disease in the process. [43-45] Studies looking at reduced iron stores after blood donation show that ferritin levels of 50 ng/mL, while not drastically elevated, are nevertheless clearly associated with deterioration of an important vascular function. The ability of the blood vessels to dilate (or relax) normally was shown to be clearly better in blood donors whose ferritin levels averaged 17 ng/mL than in those with ferritin levels averaging 52 ng/mL. [46] A similar finding was seen in another study comparing the ease of arterial dilation when ferritin levels were reduced well below 50 ng/mL. [47] Loss of easy dilation is an early finding in patients who develop atherosclerosis and other vascular conditions. Any increase in cellular or circulating free iron quickly increases oxidative stress, which is the primary reason for the endothelial dysfunction and the impaired vascular relaxation. [48,49]

The relationship between heart disease and increased ferritin levels is especially well-established. In both men and women, elevated serum ferritin is both independently and positively associated with coronary artery disease. [50-52] An even greater incidence of heart disease is seen in men and postmenopausal women than in premenopausal women, as the regular loss of blood (and iron) stops at menopause. [53]

A similar reduction in the incidence of new cancers and cancer-related deaths is seen with serial phlebotomies over a six-year period. The average ferritin level in 23 individuals dying of cancer was 136 ng/mL, while the 77 survivors had an average ferritin level of 84 ng/mL. [54] Another study revealed that reducing iron stores by phlebotomy clearly reduced the risk of cancer and cancer-related mortality. [55] As it turns out, the serum levels of all three of the toxic nutrients (calcium, iron, and copper) are significantly increased in cancer patients compared to levels in non-cancerous control patients. [56]

Now for the shocker. In the 1940s, the routine “enrichment” of flours, cereals, and grains with iron (along with some B vitamins) began on the premise that the wartime populations of the United States and the United Kingdom were subject to food rationing and that the overall availability of some important nutrients was lacking. In 1942, the U.S. Army decided to purchase only flour that was enriched. This quickly led to much of the world following in lockstep with the United States. And once it started, it never stopped, continuing to this very day.

One big problem with this is that nobody in the United States eating even the most faddish of diets has an iron deficiency, and additional iron of any kind does nothing but inflict body-wide oxidative damage on the consumer. The second problem is that much of the iron added to these enriched foods is in the form of metallic iron filings. Somehow our public health authorities have decided that eating pure metal is the best way to keep from developing deficiencies in compounds related to that metal in the body. Obviously, for iron (or any other metal that forms different compounds) to be suitable for ingestion, the metal must first be metabolized in a plant to produce consumable forms of it. Bear in mind that any form of additional iron is not good for you, but consuming iron in its metallic form is especially effective at causing the daily oxidative stress exposure in the gut to skyrocket. As a picture is worth a thousand words, and a video worth even more, please take a few minutes to view this brief video, filmed roughly 30 years ago (and nothing has changed since, as it is very easy for anyone to reproduce it): https://www.youtube.com/watch?v=HGbwFtmJOi4&t=75s

The only time iron should ever be deliberately ingested (and in a proper medical formulation, never as an unrefined metal!!) is when a low ferritin is seen AND a hypochromic, microcytic anemia (pale and small red blood cells) is present. And once the hemoglobin level is back to normal, the iron prescription should be stopped. Iron should never be taken to “protect” against developing an iron deficiency anemia. Excess blood loss, whether from heavy menses or from a bleeding gastrointestinal tract cancer, is nearly always the reason for a low ferritin accompanied by a hypochromic, microcytic anemia. And while a nutritional deficiency of iron is very common in third world countries, it is exceptionally rare in the United States. [57] Short of frank chronic starvation, dietary iron deficiency in the United States simply does not exist.

Incredibly, some of this outrage over adding metallic iron filings might finally be filtering back to the governmental agencies in charge of continuing this process, as the internet now has websites discussing “food grade” iron filings, as if the consumption of “contaminated” iron filings were the real problem. Some misguided pundits assert that the acid in the stomach dissolves the metallic iron and allows it to be absorbed and assimilated. However, basic chemistry does not support such an assertion. Metallic iron reacting with HCl (hydrochloric acid) yields ferrous chloride, which readily oxidizes further to ferric chloride, a form of iron that is exceptionally toxic, corrosive, and acidic. [58,59] Commonly used supplemental iron is in the form of ferrous sulfate, not ferric chloride.
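For the chemistry-minded, a minimal sketch of the reactions involved (illustrative only, assuming simple dissolution of the metal in stomach acid followed by oxidation of the dissolved iron):

  Fe (metallic iron) + 2HCl → FeCl2 (ferrous chloride) + H2
  4FeCl2 + 4HCl + O2 → 4FeCl3 (ferric chloride) + 2H2O

Either way, what results from dissolving iron filings in stomach acid is reactive ionic iron, not the form of iron found naturally in foods.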

As all disease results from the excess oxidation of biomolecules, it can readily be appreciated how negative an impact the ingestion of metallic iron filings will have on overall gut health and function, especially when it continues for a lifetime. No substance promotes excess oxidation wherever it is found more than free iron in its ferric form. When iron is ingested chronically in its metallic form, fragments that are large enough can directly cause a foreign body reaction. Furthermore, as more ferric chloride is formed from the HCl in the stomach, the stage is set for the nonstop provocation of chronic inflammation in the gut. This can manifest clinically in just about all forms and presentations of gut dysfunction, including leaky gut and a pathogen-overgrown microbiome. Of note, Helicobacter pylori, the pathogen now considered to be the causative agent of many cases of ulcer disease in the stomach and small intestine, thrives optimally where iron is most plentiful. [60,61]

Food allergies were largely unheard of before the widespread poisoning of enriched food with metallic iron filings. Of the articles indexed in PubMed, only about 100 addressing food allergy had been published as of 1971; now, in 2023, the phrase “food allergy” yields over 12,000 articles. Nearly all of this can be blamed on the widespread presence of the leaky gut syndrome, for which the incessant ingestion of metallic iron can be considered a major cause. Peanuts and gluten contain proteins that digest well when the gut is intact. However, when undigested segments of peanut or gluten protein make their way into the lymphatics and blood, severe allergic and autoimmune reactions are to be anticipated. In the 1950s and 1960s, a peanut butter and jelly sandwich was never the potentially fatal snack that it is today for so many children.

For many adults today, their iron assault began in infancy. For many reasons breast feeding is the healthiest way to feed a newborn. However, when electing to bottle-feed, realize that it is extremely difficult to find an infant formula free of added iron. The ingestion of this excess iron from the start of life is a major reason why so many people have never had a truly normal, well-formed bowel movement in their lives. Supermarkets and drug stores offer a massive array of pills and potions for every conceivable bowel/digestive problem, a clear visual reminder of how widespread digestive disorders are.

One of the reasons that organic and gluten-free food products are supportive of good health is that they do not have iron added to them. At least, that is the case with probably 90% or so of them. The ingredient labels still need to be read carefully, since even these food products are occasionally contaminated with added iron.

Many factors play a role in achieving and maintaining good gastrointestinal health, which is incredibly important because compromised gut health negatively impacts other medical conditions throughout the body. A major step toward good gut health throughout the population will be taken when metallic iron filings are no longer a regular part of so many diets in the United States and the rest of the world.

Recap

Most vitamins, minerals, and other nutritional supplements can be pushed to very high degrees of intake without resulting in any significant clinical toxicity. However, this is not always the case, and it is very important to be aware of the circumstances under which minimal added intake of some supplements can be devastating to achieving and maintaining long-term health.

Except under the very limited scenarios described above, calcium, iron, and copper should never be supplemented. And in the case of calcium, a high daily intake of some dairy products can be very harmful as well. All three of these “toxic nutrients,” while absolutely necessary at low levels for the health of all cells, rapidly become major weapons in bringing down the health of those same cells with only MINIMAL degrees of added intake.

Iron intake is especially problematic, as so many people are getting continually “supplemented” with iron, typically in the form of metallic iron filings, whenever they eat any of a large variety of common foods.

All three of these nutrients are among the most common, yet still almost completely unrecognized, causes of death from heart disease and cancer.

(Author of this article, Dr. Thomas Levy can be contacted at televymd@yahoo.com)

(The views expressed in this article are the author’s and do not necessarily reflect the opinions of the Orthomolecular Medicine News Service or all members of its Editorial Board. OMNS invites alternative viewpoints. Submissions may be sent directly to Andrew W. Saul, Editor, at the email contact address further below.)

References

  1. Levy T (2013) Death by Calcium: Proof of the toxic effects of dairy and calcium supplements, Henderson, NV: MedFox Publishing. To download a complimentary eBook: https://dbc2.medfoxpub.com/
  2. Michaelsson K, Melhus H, Lemming E et al. (2013) Long term calcium intake and rates of all cause and cardiovascular mortality: community based prospective longitudinal cohort study. BMJ 346:f228. PMID: 23403980
  3. Bolland M, Avenell A, Baron J et al. (2010) Effect of calcium supplements on risk of myocardial infarction and cardiovascular events: meta-analysis. BMJ 341:c3691. PMID: 20671013
  4. Shreya D, Zamora D, Patel G et al. (2021) Coronary artery calcium score-a reliable indicator of coronary artery disease? Cureus 13:e20149. PMID: 35003981
  5. Eghtedari B, Kinninger A, Roy S, Budoff M (2023) Coronary artery calcium progression and all-cause mortality. Coronary Artery Disease 34:244-249. PMID: 37102229
  6. Sribnick E, Del Re A, Ray S et al. (2009) Estrogen attenuates glutamate-induced cell death by inhibiting Ca2+ influx through L-type voltage-gated Ca2+ channels. Brain Research 1276:159-170. PMID: 19389388
  7. Fonseca M, Almeida-Pititto B, Bittencourt M et al. (2022) Menopause per se is associated with coronary artery calcium score: results from the ELSA-Brasil. Journal of Women’s Health 31:23-30. PMID: 34520264
  8. Hall J, Jones R, Jones T et al. (2006) Selective inhibition of L-type Ca2+ channels in A7r5 cells by physiological levels of testosterone. Endocrinology 147:2675-2680. PMID: 16527846
  9. Jiang R, Sui Y, Hong J et al. (2023) The combined administration of vitamin C and copper induces a systemic oxidative stress and kidney injury. Biomolecules 13:143. PMID: 36671529
  10. Xu W, Mao Z, Zhao B et al. (2021) Vitamin C attenuates vancomycin induced nephrotoxicity through the reduction of oxidative stress and inflammation in HK-2 cells. Annals of Palliative Medicine 10:1748-1754. PMID: 33302636
  11. Stadler N, Lindner R, Davies M (2004) Direct detection and quantification of transition metal ions in human atherosclerotic plaques: evidence for the presence of elevated levels of iron and copper. Arteriosclerosis, Thrombosis, and Vascular Biology 24:949-954. PMID: 15001454
  12. Myint Z, Oo T, Thein K et al. (2018) Copper deficiency anemia: review article. Annals of Hematology 97:1527-1534. PMID: 29959467
  13. Tahir N, Ashraf A, Waqar S et al. (2022) Copper deficiency, a rare but correctable cause of pancytopenia: a review of literature. Expert Review of Hematology 15:999-1008. PMID: 36314081
  14. Van den Berg G, Beynen A (1992) Influence of ascorbic acid supplementation on copper metabolism in rats. The British Journal of Nutrition 68:701-715. PMID: 1493135
  15. Pekiner B, Nebioglu S (1994) Effect of vitamin C on copper and iron status in men and guinea pigs. Journal of Nutritional Science and Vitaminology 40:401-410. PMID: 7891201
  16. Harris E, Percival S (1991) A role for ascorbic acid in copper transport. The American Journal of Clinical Nutrition 54(6 Suppl):1193S-1197S. PMID: 1962569
  17. Stone I (1979) Homo sapiens ascorbicus, a biochemically corrected robust human mutant. Medical Hypotheses 5:711-721. PMID: 491997
  18. Torti S, Manz D, Paul B et al. (2018) Iron and cancer. Annual Review of Nutrition 38:97-125. PMID: 30130469
  19. Shanbhag V, Gudekar N, Jasmer K et al. (2021) Copper metabolism as a unique vulnerability in cancer. Biochimica et Biophysica Acta. Molecular Cell Research 1868:118893. PMID: 33091507
  20. Morales M, Xue X (2021) Targeting iron metabolism in cancer therapy. Theranostics 11:8412-8429. PMID: 34373750
  21. Ge E, Bush A, Casini A et al. (2022) Connecting copper and cancer: from transition metal signalling to metalloplasia. Nature Reviews. Cancer 22:102-113. PMID: 34764459
  22. Tang X, Yan Z, Miao Y et al. (2023) Copper in cancer: from limiting nutrient to therapeutic target. Frontiers in Oncology 13:1209156. PMID: 37427098