
Putting the science in fiction - Dan Koboldt, Chuck Wendig 2018

Eye-based paternity testing and other human genetics myths

by Dan Koboldt

In 2001, scientists announced an incredible accomplishment: They had completed a draft sequence of the human genome. The complete instruction book for making a human being spans 24 distinct chromosomes (22 autosomes plus the X and Y) and is 3.2 billion letters long. That’s about one thousand times the length of the first ten Wheel of Time books put together. Sequencing the whole thing had taken ten years and something like three billion dollars.

That’s a considerable investment for taxpayers, but the scientists made incredible promises. They said it would be the scientific breakthrough of the century. With the sequence of the genome in hand, they promised to dramatically improve the prevention, diagnosis, and treatment of disease. They told us the completion of the human genome would mark a new era for human health.

They lied.

Well, that’s not entirely fair. Finishing the genome was the starting point in a long journey to understanding how our genes make us who we are. The more scientists study it, the more they find that the genome is incredibly complex. I know, because I’m one of them. I work as a human geneticist at a major children’s hospital.

Unfortunately, few things about genetics and inheritance are straightforward. They’re certainly not as simple as we often see them portrayed in books, movies, and other media. As a scientist who also enjoys science fiction, I often encounter popular misconceptions about how genetics actually works. Here are a few of the more common (and inaccurate) tropes.

Trope #1: The eye-based paternity test

Oh, if I had a dime for every time a character recognized a long-lost parent or sibling based on eye color, a widow’s peak, a peanut allergy, or some other physical quirk. Sure, first-degree relatives do tend to look alike, and many visible traits tend to run in families. Yet they should not be used to establish (or disprove) kinship because it’s not that simple.

Eye color, despite the common wisdom suggesting otherwise, is a complex inherited trait. While it’s true that blue eyes tend to be recessive and brown eyes tend to be dominant, eye color is a spectrum, not a multiple-choice test. The color of the iris is determined by the amount of melanin in it, and that can be influenced by as many as ten different genes. Brown-eyed parents can have blue-eyed children, and vice versa. Also, eye color can change: Many newborns have blue eyes that become brown or green during early childhood.
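The "spectrum, not a multiple-choice test" point can be made concrete with a toy simulation. This is only a sketch: the ten-locus count, the equal effect sizes, and the fully heterozygous parents are illustrative assumptions, not the real genetics of eye color.

```python
import random

random.seed(1)
N_LOCI = 10  # hypothetical number of eye-color loci; real counts and effects differ

def child_melanin(mother, father):
    """Each parent passes one random allele per locus; 1 = 'dark' allele, 0 = 'light'."""
    return sum(random.choice(m) + random.choice(f) for m, f in zip(mother, father))

# Two brown-eyed parents, each heterozygous (one dark, one light allele) at every locus
parent = [(0, 1)] * N_LOCI
scores = [child_melanin(parent, parent) for _ in range(10_000)]
print(min(scores), max(scores))  # offspring span much of the 0-20 melanin range
```

Even in this simplified model, two identical brown-eyed parents produce children across nearly the whole melanin range, which is why a child's eye color can't confirm or refute parentage.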

Please, don’t rely on physical characteristics to tell who’s related to whom. The inheritance of such traits does not always follow a predictable pattern. Even when it does, in real life, these kinds of tests might uncover secrets that were better left buried.

When we do genetics studies of families, we verify the expected relationships as a quality control step. About 4 percent of the time, there’s a discrepancy (most often, the reported father is not the biological father). This observation holds true across racial groups and socioeconomic strata, and has been consistently reported by many researchers for over a decade.

We call these non-paternity events and, generally speaking, we don’t report them back to the study participants.

Trope #2: Different people have different genes

Often I hear people discussing how someone has “the gene” for some trait or ability. Alternatively, an elderly person in good health is often said to have “good genes.” In truth, we all have the same set of about twenty thousand genes. In very rare cases, large segments of the genome can be deleted (which removes genes), and usually that’s a very bad thing. So the concept of people having “different genes” is not accurate. We all basically have the same set of genes. However, the base pairs in those genes can differ from one person to the next, resulting in slight differences in when and how those genes work. That’s what makes us all slightly different from one another.

That being said, I recognize that most people use the term genes colloquially. I don’t expect people to start saying, “So you’re ninety-five years old? You must have a really good set of genetic variants in and around your genes.” Even if that would make me happy.

While we’re on the topic, I should tell you that traditionally defined genes—that is, sequences that code for proteins—occupy only about 1.5 percent of the human genome. Non-coding sequences make up the rest. Some of them regulate when or how strongly certain genes are turned on, or help organize the genome inside the cell. Still others serve structural purposes, such as the repetitive sequences that make up the telomeres (the protective ends of chromosomes).

But much of the genome either has no specific function or serves a purpose that we haven’t yet uncovered.

Trope #3: Your genetic destiny is written

Gattaca became one of my favorite science fiction movies long before I entered the field of genetics. It portrays a near-future dystopian society in which the worth and future potential of an individual are determined, at birth, with a genetic analysis. As a result, most parents take advantage of genetic selection/enhancement of embryos to get the ideal combination in their future child. These designer babies get the cool jobs, whereas babies born without such intervention are basically treated as invalids.

On the bright side, the idea of sequencing every person’s genome at birth is rapidly becoming more plausible. Thanks to the advent of next-generation DNA sequencing technologies, we can now sequence a human genome in less than a week, for a little over a thousand dollars. We can use that information to infer a lot about a person, such as ancestry, risk for certain diseases, and likely physical appearance. But we’re a long way off from predicting the lifetime risk for common diseases, like heart disease, diabetes, and psychiatric disorders.

Most of these result from complex interplay between genetic, lifestyle, and environmental factors. The vast majority of genetic variants associated with disease risk have a very small effect and may only increase your risk by 5 percent. There could be thousands of such genetic factors for any given disease, so predicting someone’s health at birth, even if we knew everything about the genome, would be a very complex problem.
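Here's a rough sense of why those many-small-effects risks stay probabilistic. All the numbers below (2,000 risk loci, a 5 percent per-allele effect on the odds, a 10 percent baseline risk) are made up for illustration; they are not estimates for any actual disease.

```python
import random

random.seed(0)
N_VARIANTS = 2_000    # hypothetical number of risk loci for one disease
EFFECT = 1.05         # each risk allele multiplies the odds by 5 percent
BASELINE_RISK = 0.10  # illustrative population-average lifetime risk

# Count how many risk alleles one simulated person carries
# (one allele per locus, each present with 50 percent frequency, for simplicity).
carried = sum(random.random() < 0.5 for _ in range(N_VARIANTS))

# Shift the baseline odds by the person's deviation from the average allele count.
odds = BASELINE_RISK / (1 - BASELINE_RISK)
odds *= EFFECT ** (carried - N_VARIANTS / 2)
risk = odds / (1 + odds)
print(f"{carried} risk alleles -> estimated lifetime risk {risk:.1%}")
```

Even with every variant's effect known perfectly, the output is still a probability, not a verdict, and it ignores the lifestyle and environmental factors that dominate most common diseases.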

One thing I particularly admired about Gattaca was how the protagonist’s genetic future was described in probabilities: neurological condition, 60 percent; attention-deficit disorder, 89 percent; heart disorder, 99 percent. There are few certainties in human genetics, and the movie did well to acknowledge this.

Trope #4: Mutations are awesome

Mutations, or acquired changes in DNA, are one of the most misunderstood topics in genetics. Too often in science fiction, I see mutations treated as good or advantageous things. A telling example comes from the 2002 movie Resident Evil, in which the Red Queen (the malicious artificial intelligence that controls the facility) releases a genetically engineered monster that attacks the group of heroes. After it makes a kill, the Red Queen warns that the creature will mutate as it feeds and become something new: presumably an even stronger, deadlier monster.

The reality is that mutation, for humans at least, is uncommon. Most of the genetic variation that we have, we inherited from our parents. New mutations that arise in a child but are absent from both parents are extremely rare. We’re talking about forty or fifty throughout the entire genome, compared to three to five million inherited genetic variants.
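A back-of-the-envelope check shows how rare new mutations are relative to inherited variation. The per-base rate below (about 1.2 × 10⁻⁸ per generation) is a commonly cited estimate, used here only to show the order of magnitude.

```python
MUTATION_RATE = 1.2e-8    # approx. new mutations per base pair per generation
GENOME_SIZE = 3.2e9       # haploid genome length in base pairs
INHERITED_VARIANTS = 4e6  # typical inherited variants per genome (3-5 million)

expected_new = MUTATION_RATE * GENOME_SIZE  # per transmitted genome copy, ~38
print(f"expected new mutations: ~{expected_new:.0f}")
print(f"that's about 1 new for every {INHERITED_VARIANTS / expected_new:,.0f} inherited variants")
```

A few dozen per transmitted genome copy: the same order as the forty or fifty observed in parent-child studies, and a vanishing fraction of a person's total genetic variation.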

Generally speaking, new mutations are not beneficial. The human genome has been shaped by natural selection for hundreds of thousands of years. Think of it like a Formula One racecar. Mutations are like metal screws that you add (or remove) at random. More than likely, this won’t have any effect on the racecar, but if it does, you’re far more likely to break something than to make it better.

The body’s cells also acquire mutations over time, sometimes by chance as cells divide, but also through DNA damage induced by radiation or carcinogens. Most cells that suffer damaging mutations will die. Occasionally, however, a cell gets the right set of mutations that allow it to grow and divide uncontrollably. When this happens, cancer is the result.

Trope #5: Most genetic traits are inevitable

I think that the most common myth about human genetics is that most traits are inherited in simple and/or inevitable fashion. The genetics taught in most high school biology classes—like dominant, recessive, and X-linked inheritance patterns—may be partially to blame for this. Mendel’s laws and the Punnett square (remember those square diagrams that you used to work out genetic crosses in biology?) only work for rare genetic conditions that are due to mutations in a single gene. Cystic fibrosis and sickle cell disease, for example, are recessive disorders caused by mutations in the CFTR and HBB genes, respectively.
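The single-gene case is simple enough that a Punnett square can be enumerated exhaustively. A minimal sketch of a carrier-by-carrier cross for a recessive disorder:

```python
from itertools import product

# Carrier x carrier cross for a recessive disorder such as cystic fibrosis:
# 'A' is the working allele, 'a' the disease allele; both parents are carriers (Aa).
mother, father = "Aa", "Aa"
offspring = ["".join(sorted(pair)) for pair in product(mother, father)]
print(offspring)  # ['AA', 'Aa', 'Aa', 'aa'] -- the four Punnett-square cells
affected = offspring.count("aa") / len(offspring)
print(f"chance of an affected child: {affected:.0%}")  # 25%
```

This exhaustive enumeration is exactly what a Punnett square does on paper, and it works only because a single locus is involved; it breaks down as soon as a trait depends on many genes plus the environment.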

Although Mendel’s laws offer a useful introduction to genetic inheritance, they become problematic when we try to apply them to more complex traits. In fiction, I often meet characters living under a specter of a disease that killed their grandparents and/or parents. It seems inevitable that they, too, will fall victim to it.

Alcoholism, for example, is a complex disorder that’s often treated simplistically: “My dad was an alcoholic, so I became one.”

I’m sorry to have to tell you this, but most of the traits that make for interesting characters—intelligence, attractiveness, physical/mental health, etc.—do not follow simple laws of inheritance. They might not be passed from parent to child, or shared by siblings. The genetics underlying these characteristics will undoubtedly be complicated.

Just like we are.