True Vitamin A For Health And Happiness

“The discovery of vitamin A and the history of its application in the field of human nutrition is a story of bravery and brilliance, one that represents a marriage of the best of scientific inquiry with worldwide cultural traditions; and the suborning of that knowledge to the dictates of the food industry provides a sad lesson in the use of power and influence to obfuscate the truth”
~Mary Enig, PhD, Lipid Biochemist

Over the past century, there has been a developing insight into the role of nutrition in health. It was originally motivated by observations of the diseases of malnutrition, which largely coincided with the diseases of civilization. This became increasingly obvious with industrialization. By the mid-20th century, there was a growing health movement that brought greater awareness to this field of study.

When my grandmother was diagnosed with cancer in the 1980s, she had already been reading books on health for decades and so, instead of chemotherapy or radiation, she tried to cure herself with a macrobiotic diet. My father recalls her juicing such vast amounts of carrots, presumably to raise her beta-carotene levels, that her skin turned the same color as her beverage of choice. The carotenoids cause that, and they are the reason egg yolks and butter are yellow and turn an even deeper orange when animals are pasture-raised on lush green forage (beta-carotene is easily oxidized and so, once cut, grass quickly loses much of this nutrient; hence, cattle eating fresh greens in the spring and summer is important for the fat-soluble vitamins they store in their fat over winter). There are “currently about 600 known forms of naturally occurring carotenoids” (Sarah Pope, Busting the Beta Carotene Vitamin A Myth). “The carotenoids are further broken down into 2 classes, carotenes and xanthophylls. The carotenes consist of alpha-carotene, beta-carotene, gamma-carotene, delta-carotene, epsilon-carotene, and zeta-carotene. In the xanthophyll class we have astaxanthin, beta-crypto-xanthin, canthaxanthin, fucoxanthin, lutein, neoxanthin, violaxanthin, and zeaxanthin” (Casey Thaler, Why You Need Vitamin A). Beta-carotene is the precursor to the fat-soluble vitamin A, which is linked to the body’s immune system, to fighting off cancerous cells and other sicknesses, and to handling stress (yet “Stress conditions, such as extremely hot weather, viral infections, and altered thyroid function, have also been suggested as causes for reduced carotene to vitamin A conversion”; from Lee Russell McDowell’s Vitamins in Animal Nutrition, p. 30). Weston A. Price was an early 20th century researcher who observed the relationship between fatty animal foods, fat-soluble vitamins, and a strong immune system.

Price also discussed fertility, and I’d note that research shows that vegans and vegetarians have higher rates of infertility, as do rodents with vitamin A deficiency. I know a vegetarian couple that spent years trying to get pregnant and then spent thousands of dollars on in vitro fertilization treatments, trying twice before finally getting pregnant (the guy’s sperm were deformed and lacked proper motility). Pam Schoenfeld warns of what deficiency can mean: “in the worst case, spontaneous abortion or death of mother and offspring during labor, as described by Weston A. Price. In humans, even mild deficiencies during pregnancy can lead to compromised kidney development in the child” (The Unfair Stigmatization of Vitamin A). And Nora Gedgaudas offers a grim warning: “Vitamin A deficiency has even been implicated in Sudden Infant Death Syndrome (SIDS)! Far from being a threat to any unborn fetus, The Nordic Epidemiological SIDS Study found an association between low or no vitamin A intake and an increased risk of sudden infant death syndrome (SIDS) during their first year of life. This finding remained conclusive even when an adjustment was made for potential confounders, including socioeconomic factors. Furthermore, substantial evidence exists to show that healthy vitamin A levels during pregnancy in the mother may result in substantially reduced HIV transmission risk to her unborn child (also, vitamin A-deficient mothers were much more likely to transmit the virus to their newborn infants than were HIV-infected mothers who had adequate amounts of this critical nutrient)” (Vitamin A Under Attack Down Under).
By the way, see the works of Ken Albala and Trudy Eden (e.g., Food and Faith in Christian Culture) on how, in the Galenic theory of humors, meat, especially the red meat of ruminants, was understood to increase ‘blood’, which turns into semen and breast milk. That is to say, fatty animal foods are good for fertility and fecundity, for growth, strength, and vigor (Galenic ‘blood’ might be translated as animal spirits, life force, prana, or, in more modern terminology, libido). It is true that ruminants appear to have been the primary food source in human and hominid evolution, specifically the blubbery ruminants we call the megafauna, which hominids prized for millions of years before their die-off.

Many animals are able to turn beta-carotene into retinol, something the human body can theoretically do to a limited extent, if not as effectively as ruminants, and the final product is concentrated in the animal fat and the liver. As carotenoids are why carrots and sweet potatoes are orange, this relates to why egg yolks and butter are yellow-to-orange but not egg whites and skim milk. The deep color doesn’t only tell you of the presence of vitamin A but of the other fat-soluble vitamins as well: D, E, and K (the light color or even whiteness of the fat in factory-farmed animal foods is a sign of low nutritional quality). In my grandmother’s compromised health, she likely was not producing enough of her own vitamin A, despite getting a bounty of the precursor (her oddly-colored skin probably indicated carotenaemia, the condition of the body not metabolizing carotenoids). Worse still, beta-carotene in some cases is associated with increased risk of cancer: “In two studies, where people were given high doses of B-carotene supplements in an attempt to prevent lung cancer and other cancers, the supplements increased risk of lung cancer in cigarette smokers, and a third study found neither benefit nor harm from them. What might cause the unexpected findings? While beneficial at low doses, at higher doses, antioxidants can shut down cell signalling pathways and decrease synthesis of mitochondria in new muscle cells. They can also decrease production of endogenous antioxidants produced by a body” (Fred Provenza, Nourishment, p. 99). In getting unnaturally high levels of beta-carotene from juicing large quantities of carrots, was my grandmother overdosing herself in a similar manner? Rather than improving her immunity, did it contribute to her dying from cancer despite doing her best to follow a healthy diet? More is not necessarily better, especially when dealing with these precursors.
It would be better to go straight to the preformed vitamin A in naturally balanced ratio with other nutrients (e.g., organ meats).

Beyond cancer concerns, the bioavailable forms of retinoids, along with nutrients like B12 that are also found only in animal foods (liver has high levels of both), have a major effect on the health and development of the eyes (during the British food shortages of World War II, the government promoted eating the surplus of carrots by telling the public that it would give them the eyesight of heroic fighter pilots). Night blindness, a common symptom of vitamin A deficiency, is one of the most widespread problems. It is associated with developing countries and impoverished communities, but malnourishment creeps into many other populations. A vegetarian I know is experiencing loss of his night vision, and it stands out that the others in his family who are also vegetarian show signs of deficiencies: besides the family being regularly sick, his wife has severe calcium loss (her body is literally absorbing her lower jaw bone) and his kids have issues with neurocognitive development and mental health (autism, depression, etc). The family doesn’t only avoid meat; they also eat little dairy or eggs, for example preferring plant ‘milks’ over real milk. This is now recommended against for the young, the advice being that children drink either dairy milk or water (E. J. Mundell, New Kids’ Drink Guidelines: Avoid Plant-Based Milks). This makes me wonder about my own early development because, after being weaned early at 6 months, I couldn’t handle cow milk and so was put on soy milk. This likely was a contributing factor to my early-diagnosed learning disability, autistic-like symptoms, depression, and poor eyesight (I got corrective lenses a year or two after reading delays sent me to a special education teacher; I was an otherwise healthy kid who played sports and spent a lot of time outside, rather than watching tv; as an interesting anecdote from someone else, see what Josiah shares in The Story of How Real Vitamin A Changed My Life).
To later compound the fat-soluble vitamin deficiency, my mother, following expert health advice, bought skim milk during my adolescent growth period and, now that I think about it, that was around when my depression really went into full gear. This kind of thing hasn’t been a problem for traditional societies, which often breastfed for the first 2-3 years, and that mother’s milk would be full of fat-soluble vitamins assuming the mother was eating well, such as getting fatty animal foods from wild or pasture-raised sources (a safe assumption as long as they still had access to their traditional foods in maintaining traditional hunting grounds, fishing waters, or grazing lands).

The diseases of vitamin A deficiency have been known for millennia and were typically treated with fatty animal foods such as ruminant liver or cod liver oil, but other parts of the animal were also used. “In his pioneering work, Nutrition and Physical Degeneration, Weston Price tells the story of a prospector who, while crossing a high plateau in the Rocky Mountains, went blind with xerophthalmia, due to a lack of vitamin A. As he wept in despair, he was discovered by an Indian who caught him a trout and fed him “the flesh of the head and the tissues back of the eyes, including the eyes.” Within a few hours his sight began to return and within two days his eyes were nearly normal. Several years previous to the travels of Weston Price, scientists had discovered that the richest source of vitamin A in the entire animal body is that of the retina and the tissues in back of the eyes” (Sally Fallon Morell & Mary G. Enig, Vitamin A Saga). I might add that there is more to the eyeball than just vitamin A: “The Latin name for the retina of the eye is macula lutea. (Lutea is Latin for yellow.) This thick, membranous yellow layer of the eyeball is a rich source of the nutrient lutein, a member of the retinoid family of vitamin A precursors. Lutein supplements are now promoted as being good for prostate health and for preventing macular degeneration. The fat behind the eyeball is a rich source of vitamin A and lutein. (If you think you’d rather swallow a supplement than pop an eyeball after breakfast, remember that vitamins are heat-, light-, and oxygen-sensitive and unlikely to survive processing.) And while you’re digesting the idea of eating eyeball fat, consider that the gooey juice in the eye is primarily hyaluronic acid, rich in glycosaminoglycans.
You can get hyaluronic acid injected into your lips (to fill them out), your knee (as a treatment for osteoarthritis), and even your own eye (to treat certain ocular diseases) for $200 a dose (twenty one-thousandths of a gram). It’s called Restylane. But you can get this useful nutrient into your body just by eating the eyes you find in fish head soup, and the glycosaminoglycans will find their way to the parts of the body that need them most” (Catherine Shanahan, Deep Nutrition, p. 279). Maybe that is a home remedy my grandmother should have tried.

Despite this old wisdom, vitamin A itself was not identified until the early 1900s. In the decades following, all of the other main vitamins were discovered. Gyorgy Scrinis, in his book Nutritionism, summarizes this early history of nutritional studies that led to the isolation of the chemical structure (p. 75): “It was also in 1912 that Elmer McCollum’s research team at Yale University first identified a fat-soluble substance they called fat-soluble A or Factor A (later renamed vitamin A) found in butter, liver, and leafy greens. Through his experiments on rats, McCollum demonstrated that a deficiency of Factor A led to impaired vision and stunted growth. The team also identified “water-soluble B” or “Factor B,” later renamed vitamin B, the absence of which they linked to the tropical disease beriberi. In the 1920s scientists identified various important relationships between vitamins and human health: they linked deficiency in vitamin C, which they found in citrus fruits, to scurvy, and they linked deficiency in vitamin D, found in various foods and produced by the body in response to sunlight, to rickets. During the 1920s and 1930s, other vitamins and minerals were identified, including riboflavin, folic acid, beta-carotene, vitamin E, and vitamin K.” McCollum was motivated by a realization that some important factor was missing. “After reviewing the literature between 1873 and 1906,” writes Lee Russell McDowell, “in which small animals had been fed restricted diets of isolated proteins, fats, and carbohydrates, E. V. McCollum of the United States noted that the animals rapidly failed in health and concluded that the most important problem in nutrition was to discover what was lacking in such diets” (Vitamins in Animal Nutrition, p. 9). One of the limitations of the early research was that its approach was too broad.
When isolated macronutrients failed to serve optimal health, researchers turned to isolated micronutrients, not considering that part of the problem is the isolating of any particular nutrient while ignoring the larger process of how nutrients get used as part of whole foods. The provitamin A precursors, carotenoids such as beta-carotene, are not the same as preformed vitamin A, the retinoids: retinol and its esterified form, retinyl ester, along with retinal and retinoic acid. To conflate the carotenoids and the retinoids is reductionist and, when accepted without question, causes many problems. Nutrients aren’t simply compounds that one either gets or doesn’t. There is more going on than that.

The main complication is that the body doesn’t easily turn the one form into the other and this can vary greatly between individuals: “many people are genetically bad converters of carotenoids to retinol” (Dr. Chris Masterjohn: Why You’re Probably Nutrient Deficient – The Genius Life); for brief discussions of related genes, see Debbie Moon’s How Well Do You Convert Beta-Carotene to Vitamin A?, Joe Cohen’s The Importance of Real Vitamin A (Retinol), and David Krantz’s When Carrots Don’t Cut It: Your Genes and Vitamin A. Also consider: “This genetic problem may exist in up to half of the population. Its presence appears to be associated with high blood levels of beta-carotene, alpha-carotene, beta-cryptoxanthin, and low levels of lycopene, lutein and zeaxanthin—three other carotenoids important for health but don’t convert to vitamin A” (Phil Maffetone, Vitamin A and the Beta-Carotene Myth: “A” is for Athletics, Aging and Advanced Health). Keep in mind that deficiencies of other nutrients interfere as well: “Iron and zinc deficiency can affect the conversion to vitamin A” (Pam Schoenfeld, The Unfair Stigmatization of Vitamin A). It’s ironic that, with all the obsession over eating loads of the precursor from vegetables and supplements, the very beta-carotene that can potentially be made into vitamin A might also compete with it: “Recent research also suggests that cleavage products of beta-carotene can block vitamin A at its receptor sites—another possible anti-nutrient?” (Schoenfeld). So, even the nutrient-density of a vitamin precursor doesn’t necessarily mean there is bioavailability of the vitamin itself. “Conversion of carotenes to Vitamin A takes place in the upper intestinal tract in the presence of bile salts and fat-splitting enzymes. While early studies on this biological process suggested a 4:1 ratio of beta carotene to Vitamin A conversion, later studies revised this to 6:1 or perhaps even higher.
If a meal is lowfat, however, not much bile is going to reach the intestinal tract further worsening the conversion ratio of carotenes to Vitamin A” (Sarah Pope, Busting the Beta Carotene Vitamin A Myth). “Average conversion of beta-carotene to retinol is around 2-28% (28% is on the very generous end), meaning those who consume all of their un-converted vitamin A from plants would have a very hard time meeting their vitamin A needs, and conversion could be even lower if someone is in poor health” (Laura A. Poe, Nutrient Spotlight: Vitamin A). There are numerous health conditions and medications that make conversion difficult or impossible — for example: “Diabetics and those with poor thyroid function, (a group that could well include at least half the adult US population), cannot make the conversion. Children make the conversion very poorly and infants not at all — they must obtain their precious stores of vitamin A from animal fats— yet the low-fat diet is often recommended for children” (Sally Fallon Morell, Vitamin A Vagary). I could list many other factors that get in the way, as this conversion process requires almost perfect conditions within the human body.
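To put those ratios in concrete terms, here is a minimal back-of-the-envelope sketch, using the 4:1 and 6:1 conversion ratios quoted above plus a hypothetical 12:1 case for the “or perhaps even higher” scenario; the per-carrot beta-carotene figure is an assumed round number for illustration, not a measured value:

```python
# Back-of-the-envelope arithmetic for carotene-to-retinol conversion.
# The 4:1 and 6:1 ratios come from the studies quoted above; 12:1 is a
# hypothetical "or perhaps even higher" case. The beta-carotene content
# per carrot is an assumed round figure for illustration only.

def effective_retinol_mcg(beta_carotene_mcg: float, ratio: float) -> float:
    """Estimate retinol yield by dividing beta-carotene intake by the ratio."""
    return beta_carotene_mcg / ratio

carrot_beta_carotene_mcg = 8000.0  # assumed illustrative figure per large carrot

for ratio in (4.0, 6.0, 12.0):
    yield_mcg = effective_retinol_mcg(carrot_beta_carotene_mcg, ratio)
    print(f"{ratio:>4.0f}:1 conversion -> about {yield_mcg:6.0f} mcg retinol")
```

The point of the sketch is simply that the worse the conversion ratio, the more plant food one must eat to net the same amount of retinol, and for poor converters the required intake climbs further still.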

If you’re assuming that you are getting enough vitamin A, you’re making a dangerous gamble with your health. Take heed of a recent case where a teenager lost his eyesight and hearing after a decade of avoiding nutritious animal foods and instead following a junk food diet (Lizzie Roberts, Teenager ‘first in UK’ to go deaf and blind due to junk food diet, report reveals). “The doting mother believes vitamin A injections could have saved Harvey’s sight if they were given to him at an earlier age” (Kyle O’Sullivan, Mum of teenager who went blind from only eating crisps and chocolate blames NHS); she said that, “Back in December when we were told it was down to nutrition, we think if they’d done the blood test then and realised the Vitamin A was so low they could have given him the Vitamin A injections then and he could see a lot more out of that right eye and we could have saved it a lot better.” Of course, he wasn’t eating a ‘balanced’ diet, but like many people on modern nutrient-deficient diets he was left depending on government-enforced fortification policies to save him from malnourishment. But it turns out that, even with fortified foods, there are major nutritional holes in the Western diet. Should we fortify food even further or maybe genetically modify crops for this purpose? Should we eat even more vegetables and juice them until, like my grandmother, we all turn orange? Or else should we simply go back to eating healthy traditional animal foods?

This is the same basic issue with the precursors of other fat-soluble vitamins and the precursors of other nutrients that aren’t easily processed by humans until after other animals have done the work for us (e.g., the omega-3s in algae can’t be accessed by human digestion but fish can break them down and so, in eating fish, they become bioavailable). This understanding has been slow to take hold in nutrition studies. Consider how, even though Weston A. Price was writing about Activator X in the 1940s, it wasn’t until this new century that it was identified as vitamin K2, distinct from vitamin K1 — see Christopher Masterjohn, On the Trail of the Elusive X-Factor: A Sixty-Two-Year-Old Mystery Finally Solved. By the way, I’d emphasize the close link of vitamin A and vitamin K, as Masterjohn details: “Because vitamin K1 is directly associated with both chlorophyll and beta-carotene within a single protein complex and plays a direct role in photosynthesis, the richness of the green color of grass, its rate of growth, and its brix rating (which measures the density of organic material produced by the plant) all directly indicate its concentration of vitamin K1. Animals grazing on grass will accumulate vitamin K2 in their tissues in direct proportion to the amount of vitamin K1 in their diet. The beta-carotene associated with vitamin K1 will also impart a yellow or orange color to butterfat; the richness of this color therefore indirectly indicates the amount of both vitamins K1 and K2 in the butter. Not only are the K vitamins detected by the Activator X test and distributed in the food supply precisely as Price suggested, but, as shown in Figure 2, the physiological actions that Price attributed to Activator X correspond perfectly to those of vitamin K2.
It is therefore clear that the precursor to Activator X found in rapidly growing, green grass is none other than vitamin K1, while Activator X itself is none other than vitamin K2.” Just eat those delicious animal foods! Then everything will be right with the world. Any ideology that tells you to fear these foods is a belief system that is anti-human and anti-life.

The thing is, in their natural form in animal foods, fat-soluble vitamins are part of a complex of synergistic nutrients and their cofactors (it’s particularly important for vitamins A, D3, and K2 to be in balance). Isolated vitamins, especially in the higher amounts used as supplements and fortification to treat disease, have sometimes proven problematic for health in other ways. “Nutrient supplements may even be harmful, particularly when taken in large, concentrated, and isolated doses,” explained Gyorgy Scrinis. “An overdose of vitamins A and D, for example, can have toxic and potentially fatal effects. Some studies have also found an association between beta-carotene supplements and an increased risk of both cardiovascular disease and certain cancers” (p. 76). Furthermore, specific foods are part of a total diet and lifestyle. For hundreds of thousands of years, humans ate a low-carb and high-fat diet combined with regular fasting, which guaranteed regular ketosis and autophagy. There have been numerous health benefits shown from these combined factors. It’s fascinating that early research on the ketogenic diet in children sometimes observed it to be effective not only in treating physical diseases like epileptic seizures and diabetes but also in improving behavioral issues. This has been demonstrated with more recent research as well, showing the diverse connections to neurocognitive health (Ketogenic Diet and Neurocognitive Health; Fasting, Calorie Restriction, and Ketosis; The Agricultural Mind; “Yes, tea banished the fairies.”; Autism and the Upper Crust; Diets and Systems; Physical Health, Mental Health). Still, there are many confounding factors. Since ketogenic diets tend to increase fat intake, depending on the source, they also can increase the levels of fat-soluble vitamins. Weston A.
Price observed that traditional healthy societies getting plenty of these nutrients had both greater physical health (well-developed bone structure, strong immune system, etc) and greater ‘moral’ health (positive mood, pro-social behaviors, etc). The moral panic that more strongly took hold in the 19th century was understood at the time as being rooted in general health concerns (The Crisis of Identity).

Many of the fat-soluble vitamins, especially vitamin A, act more like hormones than mere nutrients. They influence and determine nearly every system in the body, including how they impact the development of the nervous system, brain, and gut-brain axis. If one is seeing outward symptoms like eye deterioration, it can be guaranteed that far worse and less obvious problems are already forming. Yet even outward symptoms aren’t always recognized, such as in one study where most subjects didn’t even realize they had decreased night vision from vitamin A deficiency. Our health often has to get severely bad before we notice it and admit to it. Part of this is that sickness and disease have become so common that they have been normalized. I was socializing with someone who is overweight to an unhealthy degree, but I later realized how the excess body fat didn’t even stand out to me because, by American standards, this person was normal. Anything can become normalized. My mother brought my nephew to the doctor and, in discussing how often he gets sick, told the doctor about his unhealthy diet. The doctor’s response was that all kids eat unhealthy diets. The doctor has seen so many sickly patients that she couldn’t imagine health as the normal state of the human body. About the vegetarian I mentioned, I give him credit for at least noticing his loss of night vision, but what nags me about his situation is how he seems to have accepted it as an expected part of aging, not realizing that it is a sign of a severe health concern. He probably has never thought to mention it to his doctor and, even if he did, most doctors aren’t well enough educated in nutrition to realize its significance and understand what it might mean (Most Mainstream Doctors Would Fail Nutrition).

This isn’t a problem limited to a few people. One of the main things that has declined over time is access to fat-soluble vitamins, a pattern that most clearly emerged with early 19th century cheap grains, especially white flour, replacing animal foods and then ratcheted up further with the early 20th century replacement of animal fats with industrial seed oils. It’s not only whether or not we are getting a particular vitamin. As important is how other aspects of the diet affect nutrition. A high-carb diet itself might be disrupting the bioavailability of vitamin A, and that is even more true if it is also low-fat, since the fat-soluble vitamins are useless without the fat to absorb them. From the book Malnutrition and the Eye, Donald McLaren writes: “The association of xerophthalmia with an excessive intake of carbohydrate in the diet in infancy was recorded by Czerny and Keller (1906) in their classical monograph on the syndrome they termed Mehlnahrschaden. It is now recognized that this condition is identical in all basic features to what has been called “the most serious and widespread nutritional disorder known to medical and nutritional science” (Brock and Autret, 1952) and due in essence to a deficiency of protein and excess of carbohydrate in the diet. Many local and other names have been applied to this disease but it will be necessary here to use one, and that chosen, “kwashiorkor,” has found wide acceptance. Since Czerny’s day there has been a great number of other accounts in which ocular involvement has been described (McLaren, 1958), providing good evidence for the contention that a deficiency of vitamin A is the most common of all vitamin deficiencies associated with kwashiorkor.” It has been observed by many that the populations with vitamin A deficiency tend to have a diet high in grains and vegetables while low in animal foods, particularly low in seafood (fish oil is the most concentrated source of vitamin A).
A high grain diet affects nutrition in other ways as well: “Many nutritionists consider cereal grains to be good sources of most of the B vitamins except for vitamin B12. Inspection of table 4 generally is supportive of this concept, at least in terms of the % RDA which cereal grains contain. However, of more importance is the biological availability of the B vitamins contained within cereal grains and their B vitamin content after milling, processing and cooking. It is somewhat ironic that two of the major B vitamin deficiency diseases which have plagued agricultural man (pellagra and beriberi) are almost exclusively associated with excessive consumption of cereal grains” (Loren Cordain, “Cereal Grains: Humanity’s Double-Edged Sword,” from Vitamin History: The Early Years ed. by Artemis P. Simopoulos, p. 27). Besides vitamin A and B12 affecting eye and bone health, they work together in numerous other ways (e.g., Edward G. High & Sherman S. Wilson, Effects of Vitamin B12 on the Utilization of Carotene and Vitamin A by the Rat), as is true with so many other links between nutrients. The balance is easily disturbed — all the more reason to worry about grains since they knock out large swaths of essential nutrients, including the nutrients from animal foods. Eat the hamburger and leave the bun, enjoy the steak but not the roll. There is another factor to keep in mind. A high-carb diet is a major cause of liver disease, related to metabolic syndrome, which is also associated with insulin resistance, obesity, diabetes, heart disease, Alzheimer’s, etc. The liver is necessary for the digestion of fat and the assimilation of fat-soluble vitamins. So, if someone on a high-carb diet has compromised their liver functioning, they can be deficient in vitamin A no matter how much they are getting from their diet (James DiNicolantonio made a similar point about other nutrients: “The liver makes proteins that carry minerals around the body.
So if you have liver disease, even if you consume enough minerals, you may have difficulty moving minerals around the body to where they are needed”; this would relate to how the fat-soluble vitamins are absolutely necessary for the absorption, processing, transportation, and use of minerals).

This is exacerbated by the fact that, as people have followed official dietary recommendations in decreasing fat intake, they’ve replaced it with an increase in starchy and sugary carbohydrates. Actually, that isn’t quite correct. Fat intake, in general, hasn’t gone down (nor gone up). Rather, Americans are eating less animal fat and more industrial seed oils. The combination of unhealthy carbs and unhealthy oils is a double whammy. There are all kinds of problems with industrial seed oils, from being oxidative to being inflammatory, not to mention being mutagenic (Dr. Catherine Shanahan On Dietary Epigenetics and Mutations). On the one hand, there is the loss of fat-soluble vitamins in that the industrial seed oils lack them. That is bad enough, but consider another part of the equation. Those same oils actively interfere with whatever fat-soluble vitamins are otherwise being obtained — as Gyorgy Scrinis tells it: “The nutritional engineering of foods can create nutrient-level contradictions, whereby the enhancement or removal of a particular nutrient by food manufacturers interferes with the quantities or absorption of other desirable nutrients. For instance, studies have shown that the concentrated quantities of plant sterols in cholesterol-lowering margarines block the absorption of beta-carotene and therefore lower vitamin A levels in the body. Consumers of sterol-enriched foods are therefore encouraged to compensate for this nutrient-level contradiction by eating more fruits and vegetables to increase their vitamin A levels” (Nutritionism, p. 211). Combine that with the health problems with the gut, metabolism, and liver seen in so many modern Westerners and other industrialized populations (88% of adult Americans are metabolically unfit: Dietary Health Across Generations). Maybe humans with the right genetics and optimal health could both turn beta-carotene into retinol and make use of all the fat-soluble vitamins, as precursors or preformed.
The problem is that this doesn’t describe most people today. “Unfortunately, just like with Omega 3, the evidence indicates our ability to convert beta-carotene to retinol is not sufficient (20, 21, 22). Some estimates indicate that beta-carotene intake is only 16-23% as effective as retinol intake for increasing body levels of retinol (20, 21, 22). This is supported by findings that beta-carotene supplementation and high beta-carotene intake (vitamin A from vegetables) increases serum beta-carotene levels but does not significantly impact retinol levels (20, 21, 22)” (Andy AKA Barefoot Golfer, Why Humans are Not Vegetarians – The Omnivorous Truth).

This is why vegans and vegetarians so easily get into trouble with nutrient deficiencies and are forced to rely upon supplements, although it’s questionable how helpful these supplements are as replacements for whole foods in what would otherwise be a diet that is both nutrient-dense and nutrient-bioavailable. The vegetarian I discussed above eats a “balanced diet” of fresh produce and fortified plant foods combined with a multivitamin, and yet he is still losing his night vision. Nutrients are part of a complex web of health. Too much of one thing or too little of another can mess with the levels of a particular nutrient which, in turn, can have a chain effect on numerous other nutrients. Consider calcium and how it is processed (Calcium: Nutrient Combination and Ratios); vitamin A plays a central role in bone development, and so maldevelopment of the skull that constricts the cornea or optic nerve is another way deficiency can negatively impact eyesight. Or consider that, “Dietary antioxidants (i.e., vitamin E) also appear to have an important effect on the utilization and perhaps absorption of carotenoids. It is uncertain whether the antioxidants contribute directly to efficient absorption or whether they protect both carotene and vitamin A from oxidative stress. Protein deficiency reduces absorption of carotene from the intestine” (Lee Russell McDowell, Vitamins in Animal Nutrition, pp. 16-17). We humans aren’t smart enough to outsmart nature (Hubris of Nutritionism). The body as a biological system is too complex and there are too many moving parts. Any single thing shifts and everything else follows. The only guaranteed healthy solution is to adhere to a traditional diet that includes plenty of fatty animal foods. This is easy enough for omnivores and carnivores — get plenty of liver and fish (or get it in the form of fish oil and cod liver oil), although any high quality fatty meats will work. 
And if you’re vegetarian, emphasize pasture-raised eggs and dairy. But for vegans, I can only suggest that you pray to God that you have perfect genetics, perfect metabolism, a perfect balance of supplements, and all other aspects of optimal functioning that allow you to be the rare individual with a high conversion rate of beta-carotene to vitamin A, along with conversion of other precursors (4 Reasons Why Some People Do Well as Vegans (While Others Fail Miserably)) — good luck with that!

To lighten up the mood, I’ll end with a fun factoid. Talking about genetics, an important element is epigenetics. Catherine Shanahan, quoted once already, has written an interesting discussion in her book, Deep Nutrition, that covers the interaction of nutrition with both genetics and epigenetics. Health problems from deficiencies can be passed on, but they can also be reversed when the nutrient is added back into the diet: “One example of the logic underlying DNA’s behavior can be found by observing the effects of vitamin A deficiency. In the late 1930s, Professor Fred Hale, of the Texas Agricultural Experiment Station at College Station, was able to deprive pigs of vitamin A before conception in such a way that mothers would reliably produce a litter without any eyeballs. 50 When these mothers were fed vitamin A, the next litters developed normal eyeballs, suggesting that eyeball growth was not switched off due to (permanent) mutation, but to a temporary epigenetic modification. Vitamin A is derived from retinoids, which come from plants, which in turn depend on sunlight. So in responding to the absence of vitamin A by turning off the genes to grow eyes, it is as if DNA interpreted the lack of vitamin A as a lack of light, or a lightless environment in which eyes would be of no use. The eyeless pigs had lids, very much like blind cave salamanders. It’s possible that these and other blind cave dwellers have undergone a similar epigenetic modification of the genes controlling eye growth in response to low levels of vitamin A in a lightless, plantless cave environment” (p. 57). The body is amazing in what it can do, when we give it the nourishment it needs. Heck, it’s kind of amazing even when we malnourish ourselves and the body still tries to compensate.

* * *

Vitamin A Under Attack Down Under
by Nora Gedgaudas

Traditional and indigenous diets have always venerated foods rich in fat-soluble nutrients and women either pregnant and/or seeking to become pregnant in traditional and indigenous societies (according to the exhaustive and well-documented research of Dr. Weston A. Price, author of the respected and acclaimed textbook, ‘Nutrition and Physical Degeneration’) ate diets rich in fats and fat-soluble nutrients–including liver–for this very purpose. The notion that these foods have somehow–all of a sudden—become toxic to us at any age is patently absurd. In fact, Price—himself a meticulous researcher in the 1930s—determined that traditional/indigenous societies readily consumed more than 10-times the levels of these nutrients—easily estimable at about 50,000 IU per day of preformed vitamin A, as compared to the levels of vitamin A consumed in (his) “modern times”. And people in Weston Price’s “modern day era” (1930s) were nowhere near as hysterically phobic about foods such as egg yolks, shellfish and liver as we have since become following the fabrication of irrational concerns about such foods (and animal source foods/fats, in general)! Let’s just say we’re not necessarily healthier as a species for consuming at least ten-times less of these vital and protective, activated fat-soluble nutrients today; much less are we enjoying fewer birth defects or improved overall maternal/infant health. In fact, the closer we as a society attempt to emulate government guidelines, the less healthy (according to the latest confirming research) we demonstrably become.

Primal Body, Primal Mind
by Nora Gedgaudas
p. 139

The role of certain nutrients in relation to others and the need for certain cofactors in order to optimize a nutrient’s function or prevent imbalances aren’t normally discussed at all. This, of course, leads to problems.

For instance—and perhaps critically—for each and every receptor for vitamin D, there are two receptors for vitamin A on every cell. Because of the compartmentalized approach to vitamin D research, this sort of thing does not get recognized or discussed. A relative balance of these two nutrients is vital to their healthy functioning in the body. An excess of one can create a relative deficiency of the other. For instance, if you take large amounts of vitamin D without vitamin A, you are potentially more likely to develop symptoms of vitamin A deficiency and experience an actual immunosuppressive effect. Conversely, taking certain commercial cod-liver oil supplements that are rich in vitamin A but poor in vitamin D can lead to more severe vitamin D deficiencies. (It’s important to read labels. The amount of vitamin D in a serving of high-vitamin cod-liver oil is around 1,000 IU. Most commercial brands don’t exceed between 20 and 400 IU). Recent research from Spain indicates that vitamin A is necessary for both vitamin D binding and vitamin D release to receptor sites. The two vitamins are synergistic and should always be balanced in the diet or in supplementation. Individual needs for both may vary considerably.

Fat Soluble Vitamins: Vitamins A, D, E & K
by Jenny MacGruther

Carotenoids, which include the very prevalent beta carotene, are poorly converted by the body. For example, some studies indicate that the body requires as much as twenty-one times the amount of carotenoids to create the same amount of vitamin A as one part retinol. To add insult to injury, many people, especially those suffering from thyroid disorders and small children, are even poorer converters. A 2001 study found that the conversion rate of carotenoids to true vitamin A is so poor as to render it nutritionally insignificant.

Why You Won’t Get Vitamin A From Carrots
by Lauren Geertsen

The most important fact about vitamin A is the difference between retinoids and carotenoids. The vitamin A from animal sources is retinoids, also called retinol, while plant source vitamin A is carotenoids, such as beta carotene.

Animal-sourced retinol is bio-available, which means the body can utilize it. The vitamin A from plant sources, in contrast, must first be converted to retinol to be useful in the body. This poses two big problems.

First, when we are in pristine health, it requires at least six units of carotenes to convert into 1 unit of retinol (source). To put this in perspective, that means one must eat 4 1/2 pounds of carrots to potentially get the amount of useable A as in 3 oz. of beef liver (source). What happens if we have digestive issues, hormone imbalances, or other health problems? It requires even more units of carotene in the ratio.

Second, the carotene-to-retinol conversion is HIGHLY compromised. As a matter of fact, this conversion is negligible for many individuals. This conversion is virtually insignificant:

  • In infants
  • In those with poor thyroid function (hypothyroidism)
  • In those with diabetes
  • In those who are on a low fat diet or have a history of low fat dieting
  • In those who have compromised bile production (think: gallbladder and digestive issues) (source and source)

So, do you still think carrots are a vitamin A food? As with other orange veggies, sweet potatoes provide carotenes. Although beta carotene is an antioxidant, it is not true vitamin A. We must eat true vitamin A foods on a daily basis to meet our requirements for this essential nutrient.

Beef or Carrots
by Doug Garrison

This vitamin’s proper name “retinol” refers to its role in supporting vision.  Growing up we were told to eat our carrots for healthy eyes, especially to have night vision like cats!  Hmm, do cats eat carrots?  It is the “carotene” in carrots that our bodies can (with effort) convert into vitamin A.  The drawbacks to relying on carrots for your vitamin A:

  • We must use a biochemical reaction with bile salts and enzymes to convert the beta-carotene in carrots into vitamin A.
  • The conversion rate of beta-carotene to vitamin A in each person depends on many factors and ranges from 4 to 28 beta-carotene units to produce one unit of vitamin A.

For an adult male to meet the daily recommended intake for vitamin A, he would need to consume 2 pounds of baby carrots.  (Skipping the baby carrots, he could do one pound of regular carrots, for some reason baby carrots have half the beta-carotene.  Chlorine bath anyone?)  Don’t want to eat that many carrots?  How about 2.3 pounds of kale?  If you are like me, kind of lazy, I’ll opt for my vitamin A already formed in some beef liver.  Less than 1 ounce of beef liver will do the trick.

Still want to get your Vitamin A from carrots?  Boost your body’s conversion rate by eating carrots with animal fat such as cooking carrots with a pasture grazed beef roast!  In fact, we cannot convert the beta carotene found in plants without fat in our diet as a catalyst.
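Garrison's variable-ratio point above can be sketched numerically. A minimal illustration, where the 4:1 to 28:1 range comes from his piece and the carotene intake is a hypothetical figure chosen only to show how widely the yield swings between individuals:

```python
# Sketch of the variable conversion math described above: the
# beta-carotene-to-retinol ratio is said to range from 4:1 up to 28:1
# depending on the individual. The carotene intake below is purely
# hypothetical, for illustration only.

def retinol_units(carotene_units, ratio):
    """Units of retinol produced at a given carotene:retinol ratio."""
    return carotene_units / ratio

carotene_units = 2800  # hypothetical carotene intake from carrots

for ratio in (4, 12, 28):  # best case, a middling case, worst case
    print(f"{ratio}:1 conversion -> "
          f"{retinol_units(carotene_units, ratio):.0f} units of retinol")
```

The same intake yields seven times more usable vitamin A for the best converter than for the worst, which is why a blanket "carrots cover your vitamin A" claim is unreliable.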

Vitamin A Deficiency: Health, Survival, and Vision
by Alfred Sommer, Keith P. West, James A. Olson, and A. Catharine Ross
p. 101

The ancient Egyptians and Greeks recognized nyctalopia and treated it with calf’s or goat’s liver (high in vitamin A content). By the nineteenth century, nightblindness was known to occur primarily among the poorer strata of society, particularly during periods of dietary deprivation; was exacerbated by photic stress (which bleached so much rhodopsin that synthesis could not keep up with demand, causing borderline deficiency to become manifest as nightblindness); and could be effectively treated with liver or liver oils. In fact, most other manifestations of xerophthalmia were first recognized by their association with nightblindness. Nightblindness (without evidence of xerophthalmia) was reported to have disabled Confederate soldiers between dawn and dusk, and to have affected whole regiments during the Crimean War.

Evolutionary Aspects of Nutrition and Health
edited by Artemis P. Simopoulos
p. 26
“Cereal Grains: Humanity’s Double-Edged Sword”
by Loren Cordain

Vitamin A deficiency remains one of the major public health nutritional problems in the third world [24]. Twenty to 40 million children worldwide are estimated to have at least mild vitamin A deficiency [25]. Vitamin A deficiency is a leading cause of xerophthalmia and blindness among children and also a major determinant of childhood morbidity and mortality [26]. In virtually all infectious diseases, vitamin A deficiency is known to result in greater frequency, severity, or mortality [27]. A recent meta-analysis [28] from 20 randomized controlled trials of vitamin A supplementation in third world children has shown a 30-38% reduction in all-cause mortality in vitamin A-supplemented children. Analysis of cause-specific mortality showed vitamin A supplementation elicited a reduction in deaths from diarrheal disease by 39%, from respiratory disease by 70% and from all other causes of death by 34% [28]. Clearly, the displacement of beta-carotene-containing fruits and vegetables and vitamin A-containing foods (milk fat, egg yolks and organ meats) by excessive consumption of cereal grains plays a major role in the etiology of vitamin A deficiency in third world children.

Malnutrition and the Eye
by Donald McLaren
pp. 165-171

Few effective cures have been known so long to mankind as that of liver for night blindness. No doubt this was due in part to the dramatic nature of both the onset of the condition and of its relief. It is probable that the Ebers papyrus, written about 1600 B.C. in Egypt, referred to night blindness when it recommended liver for treatment of the eyes. A literal translation reads “Another [prescription] for the eyes: liver of ox roasted and pressed, give for it. Very excellent” (Drummond and Wilbraham, 1939). At about the same time the physicians in China were giving liver, dung of the flying fox, and tortoise shell for the cure of night blindness (Read, 1936). Hippocrates prescribed the whole liver of an ox dipped in honey and the therapeutic value of liver was also known to later Roman writers. It is believed that Celsus (25 B.C.-50 A. D.) first used the term xerophthalmia. […]

It would seem that night blindness was widespread in Europe in medieval times, for we find a 14th century poet in Holland, Jacob van Maerland, referring to the disease and its cure in this way (Bicknell and Prescott, 1953):

He who cannot see at night
Must eat the liver of the goat.
Then he can see all right.

[…] The relationship to poor general health and infectious disease was noted frequently in early accounts of xerophthalmia with special attention paid to intestinal disorders (Teuscher, 1867; de Gouvea, 1883). Baas (1894) described both night blindness and xerophthalmia in patients with liver disease and there have been many confirmatory accounts since. There is now reason to believe that impairment of dark adaptation in patients with disease of the liver may not always be due to deficiency of vitamin A […]

Although the cure for night blindness had been known since time immemorial, it was not until the last century that the dietary deficiency nature of the condition was recognized. […] With the turn of the century several further steps forward were taken in the understanding of the nature of the disease. Jensen (1903) was the first to show that xerophthalmia could be cured by an adequate diet and for this purpose used raw cow’s milk. It is interesting to note that he observed a rapid improvement on this regime not only as judged by the condition of the eyes and gain in weight but particularly by the disappearance of what he called the “characteristic psychic indifference.” This recognition of the profound systemic effects of vitamin A deficiency has not always persisted since this time and the high mortality attributable to the disease in its severest form has also been lost sight of at times.

In 1904 the important observation was made by Mori that the disease known as “hikan,” characterized by conjunctival xerosis and keratomalacia and widely prevalent among children aged 2-5 years in Japan, was most common in the children of people living largely on rice, barley and other cereals, beans, and vegetables. It did not occur among fisher folk, and cod liver oil, chicken liver, and eel fat were all effective remedies.

The association of xerophthalmia with an excessive intake of carbohydrate in the diet in infancy was recorded by Czerny and Keller (1906) in their classical monograph on the syndrome they termed Mehlnährschaden. It is now recognized that this condition is identical in all basic features to what has been called “the most serious and widespread nutritional disorder known to medical and nutritional science” (Brock and Autret, 1952) and due in essence to a deficiency of protein and excess of carbohydrate in the diet. Many local and other names have been applied to this disease but it will be necessary here to use one, and that chosen, “kwashiorkor,” has found wide acceptance. Since Czerny’s day there has been a great number of other accounts in which ocular involvement has been described (McLaren, 1958), providing good evidence for the contention that a deficiency of vitamin A is the most common of all vitamin deficiencies associated with kwashiorkor.

Handbook of Nutrition, Diet, and the Eye
edited by Victor R. Preedy
p. 301
“Vitamin A, Zinc, Dark Adaptation, and Liver Disease”
by Winsome Abbot-Johnson and Paul Kerlin

McCollum and Davis (1912) found that ‘fat-soluble factor A’ was essential for growth in rats. The important connection between vitamin A deficiency and night blindness however was made by Fridericia and Holm in 1925, who observed slower generation of visual purple in light-adapted vitamin A-deficient rats than for normal rats when put into the dark.

A relationship between night blindness and cirrhosis was reported by Haig et al. in 1938 and Patek and Haig in 1939. It was thought that these patients may be deficient in vitamin A and the deficiency state was not thought to be attributable to inadequate intake of the vitamin in their food. Impairments of dark adaptation (DA) included delayed rod cone break (time when rods become more sensitive to light than cones), higher intensity of light seen at 20 minutes (postexposure to a bright light), and higher intensity of light seen at final reading (elevated final rod thresholds). Nineteen of 24 patients demonstrated night blindness but none was aware of this on direct questioning.

The Vitamin A Story
by Richard D. Semba
p. 76

The piecemeal clinical picture of night blindness caused by vitamin A deficiency (see previous chapters) finally came together between 1896 and 1904, when Japanese physician Masamichi Mori described more than fifteen hundred children with hikan — that is, xerophthalmia [37]. Mori had studied medicine at the Mie Prefectural Medical School and the Tokyo University and gone on to work in Germany and Switzerland before returning to Mie Prefecture to practice surgery. The children Mori described had night blindness, Bitot’s spots, corneal ulceration, keratomalacia, and diarrhea. The death rate among them was high. Most were between ages one and four and one-half, and many came from poor families living in mountainous regions, where the diet completely lacked milk and fish. Once under medical care, the children were given cod liver oil daily, and this proved to be an effective treatment for both the eye lesions and diarrhea. Contrary to the view of many physicians, Mori concluded that the disease was not infectious but rather was caused by the lack of fat in the diet.

Fat-Soluble Vitamins
edited by Peter J. Quinn and Valerian E. Kagan
p. 150
“Plasma Vitamins A and E in HIV-Positive Patients”
by Joel Constans, Evelyne Peuchant, Claire Sergent, and Claude Conri

During the last 10 years it has been demonstrated that vitamin A deficiency not only results in xerophthalmia and blindness, but also in mortality and susceptibility to infectious diseases (Lammer et al., 1985; Reddy et al., 1986). Treatment with massive intermittent dosages of vitamin A has resulted in a decrease in mortality in developing countries (Sommer et al., 1986). It has been suggested that vitamin A might have a positive effect on the immune system and that marginal deficiencies in vitamin A that are unable to give rise to xerophthalmia and blindness might impair immune defenses (Bates, 1995; Sommer et al., 1984). Deficiency in vitamin A clearly has effects on the immune system, and the number and function of natural killer cells were depressed in vitamin A-deficient rats (Bates, 1995). Supplementation with vitamin A (60 mg retinol equivalents) resulted in a higher CD4/CD8 ratio, higher CD4 naive T cells, and lower proportions of CD8 and CD45 RO T cells in vitamin A-deficient children compared to placebo-treated children (Semba et al., 1993b). Vitamin A deficiency might also result in depression of humoral response to proteins and alterations of mucosal surfaces (Bates, 1995). Watson et al. (1988) reported that high vitamin A given to mice with retroviral infection increased survival and numbers of macrophages and total T lymphocytes.

Vitamin History: The Early Years
by Lee McDowell
pp. 61-66

IV. Xerophthalmia And Night Blindness History, From Antiquity To 14th Century

Vitamin A deficiency is one of the oldest recorded medical conditions, long recognized as eye manifestations. For thousands of years humans and animals have suffered from vitamin A deficiency, typified by night blindness and xerophthalmia. The cause was unknown, but it was recognized that consumption of animal and fish livers had curative powers according to records and folklore from early civilizations. It is interesting to find the knowledge of the cure is almost as old as medicine.

Night blindness and its successful treatment with animal liver was known to the ancient Egyptians (Fig. 3.4). Eber’s Papyrus, an Egyptian medical treatise of between 1520-1600 B.C., recommends eating roast ox liver, or the liver of black cocks, to cure it (Aykroyd, 1958). Wolf (1978) notes that a more careful evaluation of ancient Egyptian writings (Eber’s Papyrus no. 351) reveals that the therapy consisted of topical application of vitamin A rich liver juice to the eyes. Wolf (1978) suggested that with the topical application some of the liver oil must enter the lacrimal duct and thereby reach the throat via the nose. Therefore, the vitamin A could enter the body despite its topical application.

Vitamin deficiency diseases in China such as xerophthalmia and night blindness had been very prevalent from olden times (Lee, 1940). For preventing night blindness the Chinese in 1500 B.C. were giving liver, honey, flying fox dung and tortoise shell, all of which would have cured night blindness (Bicknell and Prescott, 1955).

The term “sparrow eyed” was used for a man who could see in daytime but not at twilight. The sparrow also has vision problems at night. Even though the ancient Chinese did not know the real cause of night blindness, they knew it was caused by a nutritional disturbance. A report from the 7th century notes that pig’s liver could cure night blindness.

In the Bible book “Tobit”, blindness, apparently due to vitamin A deficiency, is described. The setting of the story is the latter part of the 8th century B.C. in the Assyrian capital of Nineveh where the people of Northern Israel had been taken captive. In this book God sends the angel Raphael who tells Tobit’s son to rub the eyes of Tobit with fish bile. After this Tobit was no longer blind (Grabman, 1973). Around 600 B.C. an early reference to vitamin A deficiency in livestock is in the Bible (Jeremiah 14:6, King James version): “and the asses did stand in high places, their eyes did fail, because there was no grass.”

Evaluation of medicine of the Assyrian-Babylonian empires (900-400 B.C.) reports eye diseases or conditions (Krause, 1934). The Babylonian word sin-lurma (night blindness) was described as “a man can see everything by day but can see nothing by night.” The prescription for cure was to use a concoction whose major ingredient was “liver of an ass.” The procedure was for a priest to say to the person with night blindness “receive, o dim of the eye”. Next the liver-based potion was applied to the eyes. Also for xerophthalmia a type of prescription that would provide vitamin A was “thou shalt disembowel a yellow frog, mix its gall in curd, apply to eyes”.

The old Greek, Roman and Arab physicians recommended an internal and external therapy with livers of goats to overcome night blindness. The Greek Hippocrates, who lived 460-325 B.C., recognized night blindness and recommended eating raw ox liver dipped in honey as a cure (Littre, 1861). The notation was to “eat, once or twice, as big an ox-liver as possible, raw, and dipped in honey”. To eat a whole ox liver seems a superhuman feat, even when we reflect that the ox of antiquity was a much smaller creature than that of today (Figure 3.4).

Mani in 1953 reviewed the Greek therapy for night blindness (cited by Wolf, 1978). A precise definition of night blindness is not found until after the time of Hippocrates. Galen (130 to 200 AD) describes patients who are “blind at night” and Oribasius (325 AD) defines night blindness as “vision is good during the day and declines at sundown, one cannot distinguish anything any longer at night”. Galen recommends the cure for night blindness as “continuous eating of roasted or boiled liver of goats”. He also suggests, as did the Egyptians a topical treatment, “the juice of the roasted liver should be painted on the eyes”.

Xerophthalmia had been known as hikan in Japan since antiquity (Mori, 1904). Mori stated that hikan was common among people who subsisted in great measure on rice, barley and other cereals, beans and other vegetables, whereas it did not occur among fisher folk. He not only recognized the entire sequence of ocular changes in xerophthalmia, but also the central role of dietary deficiency of fats (particularly fish liver oils) resulting from either a faulty diet or faulty absorption, the role of diarrhea, kwashiorkor (protein deficiency) and other contributory and precipitating events. Mori reports that administration of cod liver oil was followed by speedy relief from the disorder and that chicken livers and eel fat were effective remedies also. He incorrectly concluded that deficiency of fat in the diet was the cause of the disease.

V. Xerophthalmia And Night Blindness History, 1300-1900

Liver was the most widely used cure for night blindness. Jacob van Maerland, a Dutch poet of the 14th century concluded: “He, who cannot see at night, must eat the liver of the goat. Then he can see all right” (Bicknell and Prescott, 1955).

Guillemeau in France in the 16th century clearly described night blindness and advised liver for its cure (Bicknell, and Prescott, 1955), which was also advised by other writers during this time.

The first mention of liver for the eyes in England was in Muffett’s “Health’s Improvement” (1655), though Bayly, at one time Queen Elizabeth’s physician, in his book on eyes recommends “rawe herbes” among which is “eie bright” (Drummond, 1920). The only evidence of night blindness being common at this time is references to mists and films over the eyes; “rawe herbes” would of course provide provitamin A (Bicknell and Prescott, 1955).

In 1657 Hofer expressed the view that night blindness is caused by malnutrition with this thought reintroduced nearly 100 years later in 1754 by von Bergen (Rosenberg, 1942). In 1754 he also speculated that night blindness might be due to excessive exposure to sunlight. This would later be confirmed as light increases the need for regeneration of retinal visual pigment. In a review of literature Hicks (1867) noted that night blindness was noticed by Baron Young in Napoleon’s Egyptian campaign in 1798.

At one time, night blindness was a typical disease of seafarers, due to the lack of fresh food. Aykroyd (1944) in his accounts of Newfoundland and Labrador fishermen noted they not only recognized how bright sunlight may bring on night blindness, but also used liver, preferably the raw liver of a gull or puffin for a cure. In rural communities inability to see in the dusk is a very serious condition; fishermen, for instance, may walk off the rocks into the sea after landing in the evening. Night blindness can be cured, often in 12 hours, by eating food rich in vitamin A, such as liver. In addition to eating liver, patients with night blindness were recommended to hold their heads over steam rising from the roasting liver. The dramatic quickness both of the onset and cure explains why liver has been used for centuries for prevention and cure of night blindness (Bicknell and Prescott, 1955).

Another region of the world where liver was used to control night blindness was central Africa. Medicine men in Ruanda-Urundi prescribed chicken liver to cure night blindness (Tseng Lui and Roels, 1980). The origins and original date of implementation of their therapy are unknown.

During the Lewis and Clark expedition (1804-1806) to open up the far west in the United States, there were a number of men who developed severe eye troubles toward the end of the trip. These men had lived for long periods upon buffalo, dog, and horse meat; the muscle of these animals is low in fat-soluble vitamins (McCay, 1973).

Magendie (1816) in France studied lack of protein in dog diets by feeding the animals sugar and water. Magendie noted that during the third week, dogs were thin, lost liveliness, had decreased appetite and developed small ulceration in the center of the transparent cornea. The ulcer appeared first on one eye and then the other; it increased rapidly and at the end of a few days was more than two millimeters in diameter and in depth. Soon the cornea was entirely pierced and the fluid of the eye was flowing out. An abundant secretion of the glands of the eyelids accompanied this singular phenomenon. Magendie repeated this experiment twice more with identical results.

Although xerophthalmia had been known in ancient Egypt, Magendie appears to be the first to experimentally produce the condition. Not only did Magendie record the production of xerophthalmia in animals, he recognized the analogous conditions in man as a result of a restricted diet. In his report Magendie noted an experiment by an English doctor named Stark. Stark lived on an exclusive diet of sugar for one month. In his eyes appeared livid red spots, which seemed to announce the approach of an ulcer (xerophthalmia). C.M. McCay in 1930 suggested that Magendie was the father of the vitamin hypothesis.

In a journey from Calcutta to Bombay, India in 1824-1825 vitamin A deficiency was observed (Aykroyd, 1944). Individuals would sometimes describe their condition as being “night blind.” By the mid-1800s, xerophthalmia was recognized in many areas of Europe, particularly in Russia during the long Lenten fasts (Blessig, 1866), the United States (Hicks, 1867) and elsewhere around the world (Sommer and West, 1996). Hubbenet (1860) reports night blindness in the Crimean War (1853-1856).

Inflammation of the cornea and conjunctiva associated with night blindness had been ascribed to “defective nutriment” by Budd in 1842. Wilde (1851) made similar conclusions during the Ireland famine in 1851. For treatment he recommended cod-liver oil.

In 1857 David Livingstone, a medical missionary in Africa, described eye effects for his native carriers when they were forced by circumstances to subsist for a time on coffee, manioc and meal (Livingston, 1905). He noted that the eyes became affected (e.g. ulceration of the cornea) as they did in animals receiving starch. He was probably referring to the dog study of Magendie in 1816.

Hicks (1867), a doctor in the Confederate army, described night blindness in the U.S. Civil War (1861-1865). The disease was so extensive in the Army of Northern Virginia as to resemble an epidemic. Soldiers attributed it to the “effect of the moon-light falling upon their eyes while sleeping upon the ground.” A soldier who had marched all day without problems would complain of blindness at the approach of early twilight and make immediate application for transportation in an ambulance. At such times he would be found blundering along like a blind man, holding on to the arm of his companion. When those with night blindness were examined at night by candle-light, the pupil was found dilated and unresponsive to the stimulus of the light.

To overcome night blindness, extreme treatments such as cupping, leeching, blistering, iron, mercury, and potash were used extensively, but most often did more harm than good (Hicks, 1867). Cases frequently recovered spontaneously after all treatments had been abandoned. Hicks (1867) observed that a furlough from the army was the most beneficial cure for night blindness. It was noted that poverty, filth, and the absence of vegetables were associated with night blindness. The disease was most prevalent when symptoms of scurvy were also observed, and vegetables were found to benefit both scurvy and night blindness.

In 1883 De Gouvea described night blindness in poorly nourished slaves in Brazil. He noted that the slaves were unable to see when returning from work after sunset, but could see well when starting for work before sunrise. Their food was beans, pork fat and corn meal. Exposure to sunlight was suspected of inducing night blindness, and resting the eyes at night was believed to result in recovery.

In the 1800s a number of investigators related malnutrition to night blindness and more clearly described eye problems as related to malnutrition. The researchers, cited by Rosenberg (1942) and Jayle et al. (1959), included Bamfield (1814), Schütte (1824), Foerster (1858), Von Graefe (1859), Hubbenet (1860), Netter (1863), Bitot (1863), Parinaud (1881), Toporow (1885), and Kubli (1887). Bitot’s name is widely known in conjunction with his observations on conjunctival and corneal changes in vitamin A deficiency. Hubbenet (1860) observed children in a French orphanage and described the progression of xerophthalmia from night blindness through conjunctival and corneal involvement, attributing it to a faulty diet. Toporow (1885) called attention to the importance of fats, and Schütte (1824) and Kubli (1887) to that of cod liver oil, in the prevention of night blindness (cited by Jayle et al., 1959; Loosli, 1991). Parinaud (1881) opened a new era in this field by connecting night blindness with a slowing of the regeneration of retinal pigment.

VI. Relationship Of Cod Liver Oil To Eye Disease

In the early years of recorded history, liver was the principal food recognized in a number of societies as beneficial for controlling night blindness. The discovery of vitamins A and D was closely tied to studies with cod liver oil, which has been used since very early times by the Greenlanders, Laplanders, and Eskimos (Loosli, 1991). Harris (1955) reviewed the early medical uses of cod liver oil. The earliest medical record is of its use for the treatment of rickets in 1766 in Manchester, England. In 1848 a treatise on cod liver oil, published in Edinburgh, described how xerophthalmia could be cured with the oil. In 1874 Dusart noted in a Paris research bulletin that his common treatments were wine, quinine, cod liver oil, and body massage. The beneficial effect of cod liver oil in the treatment of rickets, osteomalacia, generalized malnourishment, and certain eye conditions was widely recognized by the middle of the 19th century.

Cod liver oil used to be prepared in wooden barrels that had small holes bored in the side and plugged with pegs. As the fishermen cleaned the cod for salting and drying, they threw the livers into these barrels. After the livers rotted, the oil was set free and rose to the top, where it could be drawn off through the holes in the side of the barrel. This cod liver oil had many of the attributes of the medicines of olden times, namely a dark brown color, an unpleasant odor, and a nauseating flavor. Historically, cod liver oil was used in the treatment of both eye disease and rickets.

In a particularly thoughtful and well-documented study published in 1881, Snell demonstrated that cod liver oil would cure both night blindness and Bitot’s spots. Within a decade, meat, milk, and cod liver oil were routinely administered for corneal ulceration and dissolution (keratomalacia). In 1904 Mori gave an elaborate account of large numbers of cases of xerophthalmia in Japan and how cod liver oil cured the condition.

Vitamin D3 and Autophagy

Vitamin D3, a fat-soluble vitamin, is one of the most important micronutrients. I won’t describe all of its health benefits here. But its effect on the body can be more like that of a hormone in how powerfully it influences numerous physiological processes and systems.

Here is what I’ll emphasize for the moment, as an example of how to think about health in a more complex way. Unless you live near the equator and are nearly naked outside in the sun for most of the day, you are almost certainly not getting enough vitamin D3 through your body’s own production of it. The only other natural source is animal foods. So, be sure to eat plenty of fatty animal foods from pasture-raised animals, especially organ meats, eggs, and dairy.

Let me throw out the issue of autophagy. Eating protein, as with eating carbs or really anything, shuts down autophagy. And we want some autophagy (i.e., cellular repair and regrowth), as it is essential to health and longevity. Some people blame protein for lack of autophagy, but that is nonsense; it is no more to blame than anything else. Sure, you should fast from protein on occasion. Then again, you should fast from everything on occasion. But fasting won’t give you the benefits of autophagy if you don’t have all that is required to make it possible. Guess which nutrient enhances autophagy? Yep, vitamin D3.

Someone severely restricting their protein consumption is unintentionally also restricting their vitamin D3 intake. They’ll have a harder time getting into full autophagy with all of its benefits. This is even more true for those who, in avoiding fatty meats, eat a high-carb/low-fat diet instead. Not only are they not getting healthy amounts of vitamin D3, but they also aren’t regularly in ketosis. And one has to first be in ketosis before one can be in autophagy. On a high-fat ketogenic diet, all it takes to reach autophagy is a relatively shorter fast, because the body is already fully primed for it.

It is true that eating protein shuts down autophagy by up-regulating the drivers of biological growth, mTOR and IGF-1. That isn’t a bad thing. We want our bodies to grow, just as we also want our bodies to repair. The optimal condition is to cycle back and forth between these two states. Vitamin D3 from fatty animal foods is key for both, as it promotes bone growth and promotes autophagy, among much else. Don’t deny yourself. Enjoy those delicious fats from high-quality sources. Feast until satiation and, to balance it out, fast on occasion.

* * *

As a side note, deficiency in vitamin D3 is associated with such things as Alzheimer’s.

It makes me wonder if that is related to the role of vitamin D3 in autophagy. Alzheimer’s is accumulated damage involving (among other factors) insulin resistance and inflammation, both of which relate to low-carb/high-fat diets along with ketosis and autophagy.

But vitamin D3 out of balance can also be a problem, as it works closely with the fat-soluble vitamin A (true vitamin A, that is retinol, of which beta-carotene is only a precursor). Vitamins A and D3 form a fat-soluble trio with vitamin K2. You can learn more about this from Kate Rheaume-Bleue, although credit must be given to Weston A. Price.

* * *