A little bit off topic, but even after years of active interest, I'm still amazed by the complexity of the human immune system.
Imagine this: we are all born with a functional immune system that is pre-programmed with knowledge of what bacteria, viruses, and many parasites look like, so it can immediately deal with these without prior exposure. This is the innate immune system, and in many organisms it is the only one.
On top of that, a database is created which consists of fragments of all our bodies' own molecules. This database is used to train a second system: the thymus presents these molecules to new white blood cells (T cells) and screens out the ones that recognize these "self" molecules. This is the adaptive immune system.
Still on top of that, there's another tier, because maybe 0.1% of T cells escape the first-pass screening. A series of checks and balances screens for these escaped cells outside the thymus, and either reduces their functioning or eliminates them entirely. This is peripheral tolerance (what this year's Nobel Prize in Medicine was awarded for).
And when there's an actual infection, this system is able to spin up a few VMs, run a large bug-search model, and create a pool of tailor-made antibodies and T cells specific to the new bug, which in most cases are enough to deal with the infection.
So when all is said and done, and the system is trained and working as expected, you now have an immune platform which, on top of recognizing all its own molecules, can also recognize pathogens, including differentiating disease-causing ones from the benign ones; can also deal calmly with the enormous diversity of things we put in our mouths, noses, and other orifices; and in most cases doesn't actually go rogue.
A lot of science supports the idea that helicopter hyper-parenting is hurting kids by having them grow up in an environment that’s too sterile. Let your kids go outside and roll around in the mud a bit. God forbid they lick the floor. Science says it’s good for them!
The Hygiene Hypothesis, which is what you have described, fell out of favor in research circles during the late 2000s. What seems to be causing the allergy and autoimmunity epidemic is a loss of keystone species in the gut that have co-evolved with humans and provided essential services to the immune system, as well as stability to the entire ecosystem. See e.g. a famous review on this issue [1]. Other important factors might include novel protein antigens and small molecules we have not co-evolved with.
1. Early-life exposure to diverse, nonpathogenic microbes is linked to lower risk of allergies and asthma: "farm effect" (Amish vs. Hutterite study showing microbe-rich house dust associates with protection).
2. What matters is exposure to the right environmental and commensal microbes, not skipping handwashing or basic hygiene. Microbial diversity is good; pathogen exposure is not.
Prior to the EpiPen, people carried the Ana-Kit. It became commercially available in 1963 and was a little kit containing a syringe pre-loaded with epinephrine, antihistamine tablet(s?), and a tourniquet.
People in anaphylactic shock sometimes (often?) need more than one dose, and the antihistamine should be taken ASAP. The epinephrine just bridges the gap until the antihistamine kicks in.
I liked the Ana-Kit because the syringe had 2 doses in it (you turned the plunger 90° for the second dose) and it came with the antihistamine. It was much cheaper, and it was pretty easy: just pull off the needle cap, stick your thigh to the hilt, and press the plunger.
Despite the relative ease of autoinjectors like the EpiPen, I was pretty upset when the Ana-Kit was discontinued and I had to start carrying EpiPens. That's why I always get the generic 2-pack prescribed and keep it in a ziplock bag with a couple of Benadryls.
I grew up in the communist bloc, thoroughly oppressed by the Russians with their military bases all around, ready for a nuclear WWIII that never came.
Peanut butter isn't something I ever saw before adulthood, well into the '90s; it simply wasn't a thing. I guess it was an evil capitalist invention whose only goal was to subvert our fine communist paradise, like, say, the Star Wars movies. Raw peanuts were frequent, though; I guess they were one of the very few things that actually made it through the badly functioning central planning and weren't stolen by apparatchiks and collaborators for themselves. I never met a single kid with any sort of peanut allergy growing up; I never knew it was a thing. I recall one or two with asthma or hay fever, and that's it. But the same could be said about any form of mental disease or issue, for whatever reason: anxiety, ADHD variants, and so on were either ignored, undiagnosed, or genuinely present at much lower levels. I don't know.
Kid misbehaving? A fine smack or some other physical punishment settled things, at least in primary school. Then things started to change a bit, until they overcorrected to where we are today.
This is such a strange modern take. Parents didn't "allow" their children around others. Unless they were royalty, most children wouldn't spend more than 40 minutes a day with their parents. We don't see that concept coming to America until the late '80s and early '90s, in affluent neighborhoods trying to socially distinguish themselves from "ghetto" families.
tl;dr:
In families who use hand dishwashing, allergic diseases in children are less common than in children from families who use machine dishwashing. We speculate that a less-efficient dishwashing method may induce tolerance via increased microbial exposure.
I think I’ve heard three different comedians make a joke out of this, saying that if we just gave all the kids peanut butter, the problem would eventually take care of itself.
But it’s not actually a joke.
I guess that’s the weird thing about jokes. They often get very close to something that is true but they come at it sideways, so it’s funny instead.
One of the difficult parts of this advice for me was that my daughter wasn't eating food at the time when we were supposed to introduce it. In those cases, you're supposed to add peanut butter to the milk, which we did a few times. We let it slip for a few weeks, because it was one more thing in a pile of many things. We got her back eating peanut butter once she started eating food, but it was too late. She had developed a peanut allergy.
After going through the desensitization program at an allergist, we're on a maintenance routine of two peanuts a day. It's like pulling teeth to get her to eat them. She hates peanut M&Ms, hates salted peanuts, hates honey roasted peanuts, hates plain peanuts, hates chocolate covered peanuts, hates peanut butter cookies, and will only eat six Bamba sticks if we spend 30 minutes making a game out of it.
I highly recommend being very rigorous about giving them the peanut exposure every single day. It would have saved us a lot of time.
My daughter was in the original study, back in 2012. It was very interesting, but a lot of work. She was randomised into the early introduction arm, so she had a whole load of allergens that she had to eat regularly. iirc, as well as peanuts it was egg, sesame, white fish, milk and wheat. There were several trips to London for lots of tests.
By the time my second daughter was born in 2014 they'd told us some of the preliminary results, so we followed the guidance even though she wasn't in the trial – no peanuts in the house so she couldn't get any on her skin until she was able to eat solids, and then peanut butter was her first solid food and we fed it to her throughout her infancy. I don't know if that's the cause, but she still eats it by the bucketload.
How long did you delay for? It's not like there's some tiny window of opportunity that closes at 10 months or whatever. Consider that the Spanish conquistadors, who literally never saw a peanut as children and tried their first peanuts as adults, all survived long enough to make peanuts a globally accepted food. You can't blame yourself. To think that somehow not delivering peanut exposure was a sure cause of the allergy is nonsense.
Prior to modern hygiene, most humans probably had worms, as well as having to constantly battle other pathogens. Immune systems had no time for peanuts in the face of these other threats.
I don't remember exactly, but I suspect that the introduction and then disappearance was worse than not introducing it at all until we could do it consistently. It was probably something like six weeks between giving up on peanut butter in her milk and starting her on solid food.
I'm not aware of a recommendation to give peanuts or other possible allergens that regularly; at least I'm certain that's not a thing where I live. The change was that peanuts used to be avoided completely for years, and now they are added when it fits, like peanut butter on toast once in a while. Outside of the desensitization therapy you're going through now, you don't give two peanuts every day or put it in milk regularly. You just test for an allergic reaction early and then stop thinking about it; that's the change.
So you did nothing wrong. The six week pause was completely meaningless.
Sounds more likely that she was just bound to get the allergy anyway. Giving children exposure to allergens early decreases the risk; it does not eliminate it 100%. I doubt that not feeding her peanuts every day was what made the difference.
Yeah if the kid's immune system was so sensitive to peanuts they might have gone from 95% chance to 90% chance with exposure or something like that. I expect the population level risk would be heavily skewed by those with low sensitivity (who might benefit a lot from exposure) and those who wouldn't have developed an allergy either way.
> I highly recommend being very rigorous about giving them the peanut exposure every single day
I honestly can't tell if this entire post is some kind of parody or what. That cannot be real. I don't know anyone, nor have I ever heard of anyone, basically force-feeding their child peanuts to maybe avoid a peanut allergy later in life. It sounds insane, just like the presumption that your daughter developed a peanut allergy because you missed some imaginary time window in her development. That cannot possibly be real.
Living with a deadly allergy for the rest of one's life is no fun at all. A large part of social life is eating together in various locations where the allergen may not be so strictly controlled. Either one risks death weekly or one opts out of many social activities. It is awful, and not wanting that for one's child seems natural to me, not insane.
I deliberately expose my child to a lot of things I want them to have in their life: climbing, swimming, the game of go, Unix command-lines, Newton's laws, musical instruments, etc. Doesn't seem odd to add peanuts to the list. (Well, for us it is logistically inconvenient because another member of the household has a deadly allergy, but if it weren't for that it would be sensible.) Not insane either.
Is it the idea of exposure leading to lower prevalence that sounds insane? That's been relatively strongly established in randomised trials. Not insane.
Some parents seem to perpetually live on the verge of an existential crisis for fear that they might do or not do something that will forever scar or harm their child.
I think you are reading the parent comment wrong. They are highly recommending it because their child DID get a peanut allergy not because they MAY develop one later.
No, I did read it that way. I understand perfectly that their child developed a peanut allergy, and I'm very sympathetic. But the assumption that feeding her peanut brittle within some specific time period would have avoided it is just pure fantasy, or wishful thinking at best. They are of course free to assume so, and I am well familiar with the "if only I did things differently" feeling that every parent gets.
It's not a silver bullet, but there are many studies (including the ones referenced in the article) that show that introducing peanuts in a consistent early childhood window reduces the likelihood of later developing a peanut allergy. I don't think this is "pure fantasy."
Our daughter recently developed EILO. It sounds silly and totally illogical, but more than once, I've found myself wondering if there is anything we could have done to have helped her avoid it.
So yes, that feeling just comes with being a parent, I guess.
I organized a toddler group. Trust me, that absolutely can be real. One mother in particular always seemed to opt for exactly the bad option, from sitting the baby up way before it was ready (-> long-term increase in the likelihood of back problems), to exposing it to sun without sunscreen by choice, "for tolerance" (-> long-term large increase in the likelihood of skin cancer), to force-feeding solid food way before the baby could cope (-> nothing long-term, I'm just surprised it survived that). Bad instincts plus outdated or wrong knowledge. Thinking there is some regular peanut diet to follow would have fit right in, as would completely avoiding peanuts.
I'm very curious what method they used to attribute this to the advice. I ask because I swear I saw something around that timeline about trans fats and how they were a possible culprit in a ton of nutrition-related woes. Notably, 2015 was when they were removed from the "safe" list, and they were on the way out during this time.
It sucks, as I can't find whatever paper I thought I read that implicated trans fats in allergies. Searching "trans fats allergies" shows several. I'm assuming it was one of the main results.
So my question is largely: why would it be more likely that the advice is what reduced allergies? It seems that if there was evidence trans fats were leading to increased allergies, removing them would be by far the bigger driver?
I have a vague but possibly false recollection that someone noticed that mothers who ate peanuts while breastfeeding had lower incidence of peanut allergies in the child. And they started pulling on that thread.
There's a certain wealthy area near me where restaurants ask first if you have allergies, and ice cream shops ask if dairy is ok. My wife and I always joke, "we're in that part of town."
> where restaurants ask first if you have allergies
Pretty common expectation in many countries. I was surprised to see this is not normally a thing in the US, given that we're led to believe you guys love to sue each other.
Is the joke that they are respectful with regards to allergies? Or am I reading a bit much of an attitude into your comment? Because it comes off as rude and tone deaf.
With a child who has anaphylaxis-level PA, who has had such a reaction a couple of times and has thus built up fear and anxiety, and whom we can't just casually let attend b-day parties etc. etc. etc., I can assure you it's not a joke to us.
And no, we are not overly clean; in fact we love going outdoors into the woods and getting dirt under our fingernails. Nor did we hold her off peanuts when small; her first reaction came when she had just learned to walk, at about 10 months, and ate a tiny piece found on the floor. And we as parents work very hard on trying to have a casual attitude towards life, on her anxiety, and on not letting the PA define who she is or what she does. But then something like last week happens: those who make the food for school messed up her box of food, and she ate mashed pea patties and got really, really bad, her worst in years. Boom, all her confidence in school down the drain.
It's heartbreaking, really. To see her carry all that fear and pain, when we can only do so much to help her with it. And it's heartbreaking to see it being a joke to some. When I see such attitudes, I try to think that they come from someone who is living a happy-path life, and well, good for you.
Thanks for coming to my TED talk, and smash that bell button.
The joke is that there is a group of people who are fashionably allergic to various things. Remember 15 years ago when suddenly everyone was celiac? I'm all for cutting junk carbs out of the diet, but this was something people were just "discovering" for themselves. It always seems to be some sort of health-nut crowd which is often far more vocal than those actually suffering.
I never heard people self-diagnosing as coeliac, that would be ridiculous - diagnosis for that requires both a positive blood test for the antibodies and then a gastroscopy/biopsy. The 'trendy' crowd tended to be self-professed as "gluten intolerant" or "gluten sensitive".
Shouldn't you as a customer be in control of which food is served to you? If you have an allergy, ask for the ingredient list and then decide what to order.
As if in rich neighborhoods people cannot speak for themselves and should be babysat by the serving staff.
Is it really healthy for people to fret over minor allergies they probably don't even have, and pass that anxiety onto their kids? And can't this behaviour even cause allergies, and cause people with real allergies to be taken less seriously?
You know who "those people" are, don't pretend you don't. You just don't want them mocked either because they make you feel comfy or because you get mistaken for them.
18 years of restaurants checking in, and we did everything we could to be respectful of allergies, but the onus was always on the customer to tell us. Now when I go to places to eat they ask me, which is a different approach and, like the comment OP, one I find much more often in more expensive restaurants. I think you're reading way too deep into this comment. What do you think, chat?
I just think it seems hugely inefficient for a waiter to ask hordes of non-allergic people for allergies for every one allergic person that could just as well announce theirs.
So what? It's a trivial kindness and it takes a few seconds. What else were they going to do with those seconds? It's different if it's fast food, where maximizing people I/O does matter. But that's not the case here.
Further if the restaurant asks ahead of time, that's a signifier that they take it seriously. If you have to tell them, it's much more likely you will encounter cavalier treatment of cross-contamination and such. For some people, that really can be life or death.
The joke is on parents who have made it a class signifier that they can afford to be more involved in their children's lives. This then extends to the businesses they frequent.
What is heartbreaking for me is all the wasted effort and pressure parents are putting on their children for little tangible gain.
I don't think there's a joke here, or at least I'm not reading it this way. If anything it's not about "weaklings" but about being in an area where people are more likely to sue a business if they aren't warned beforehand - a street vendor elsewhere will not ask because their risk profile doesn't include being sued by someone who bought ice cream from an ice cream truck.
Like if you walk into a store and they offer you coffee or even a glass of prosecco, I would also say to my wife "oh we're in that kind of store now" because you know you're about to be ripped off in some way. Not that other stores are for weaklings.
In this case, the joke is that someone with a dairy allergy could somehow buy and eat ice-cream without knowing it contained dairy.
It's just a completely ridiculous thing to check. Like warning that the boiled peanuts contain peanuts, or that a pencil sharpener should not have fingers inserted into it.
Possibly something more like "people who have non-imaginary dairy allergies aren't likely to go into an ice cream shop, and even if they do, they obviously won't order without emphasizing their dairy allergy".
It's kind of similar to the Whole Wheat Bakery asking you whether you're OK with gluten. If you aren't, you made a big mistake walking in.
Affluent areas in general have more variety. The ice cream shop may be a place where you can get all kinds of ingredients that you wouldn't find at other places. This is 100% true for "fine dining" and it's one reason why they ask.
They will also have substitutes for an allergen to make the experience just as pleasant, which is another reason they ask.
Do they ask every single customer if dairy is okay? Because that was the original point... not that the ice cream shop also offered non-dairy versions.
It is very much a race and culture thing across the Midwest. Dairy and cows have a connection to being "white Americans" all the way back to the Old West and cowboys. A huge number of them are lactose intolerant (Native American genes) but insist, and even put family pressure on anyone who tries to move away from cheese, butter, milk, and dairy in general.
In my experience there’s still much more to this. I’m sure it helps at the population level like the article describes, but it’s not foolproof. We fed our first nuts early, and they still developed an allergy to all nuts. Our second didn’t get nuts until much later and he’s fine. There’s more to the story than timing; notably, my first has eczema and asthma too, so there’s that atopic march.
Allergy rate decreases with birth order. Of course, that's at the population level and probably not a strong enough effect to notice if you only poll a dozen parents you know.
Something about this just reminds me of when I did a literature review in my anatomy class to address the question: "Is running bad for your knees?"
I had to decide which of two sets of peer-reviewed publications that contradicted each other was less guilty of using the data to support a predetermined conclusion, rather than letting the data speak for itself and drawing an honest conclusion.
Compared to PhDs, MDs hate designing experiments and would rather just extrapolate a different conclusion from the same longitudinal study by cherry-picking a different set of variables. The only articles I bother reading from the NEJM anymore are case studies, because they're the only publications that consist of mostly original information.
The fun part is realizing that any and all exercise comes with risks, and running probably is bad for your knees in the long term - but maybe the long term health benefits to the rest of your body of running outweigh the risk of damage to your knees.
Your personal health profile or family history may also put you at higher risk for cartilage degeneration from running, which would shift the balance in the other direction.
Blanket statements about medical outcomes like that are useful for medical practice in general, but can be misleading for individuals making health decisions if they ignore other relevant factors. There are also plenty of doctors who will not take those other relevant factors into account and just go by whatever the last training or research they were exposed to said (which, incidentally, is also why big pharma companies invest in salespeople to target doctors' offices: because it works).
Why “probably bad,” though? If you plug that exact query into Google, numerous recent studies will tell you that it is probably good for long-term knee health too (stress builds resilience unless recovery time is denied). Which studies are probably right?
"Probably" is being used here because the body doesn't have a really good way to rebuild cartilage, especially as you age, due to lack of blood flow into places like the patella. Knee and hip replacements are on the rise (https://oip.com/the-lowdown-on-the-uprise-of-knee-and-hip-re...) as well in Boomers, indicating that age related degeneration (with or wihtout a history of running) is fairly universal and expected.
There's absolutely some perfect middle ground of "just enough" running that will strengthen, but not deteriorate too quickly, your knees - but again where that point is will vary by individual. It also may not be something that can be determined except in hindsight, partly because medical professionals generally don't start monitoring cartilage until the person is reporting pain or mobility issues (or a known condition they're checking for symptoms of).
Point being that statistically there are useful trends in aggregate data that can be observed, but, paradoxically, those trends don't necessarily translate to good general medical guidance. One counterexample where those trends do translate would be something like that peanut allergy study from 2015 that was linked on HN recently about introducing allergens earlier and frequently to babies, resulting in fewer teen/adult allergies.
> because the body doesn't have a really good way to rebuild cartilage
Okay, but I’m still noting that if you google this exact claim, numerous recent studies find that running builds cartilage, contrary to past assumptions.
You can get almost all of the health benefits of running from walking (weight loss, cardiovascular performance, etc.), it just takes much longer. Also, running is better for the bones (but worse for soft tissues).
Cycling/spin can yield more aerobic intensity with less stress on knees, you can't even get to Zone 2 with walking unless you're very overweight. Of course there are bodyweight options like aerobics, shadow boxing, jumping rope.
Skipping rope would be my favorite were it not for the fact that you need a lot of overhead clearance for the rope. This makes it unviable except outdoors or at a gym/facility.
Not what this conversation is about, but anyone running and worried about their knees should consider doing a little cycling. Doesn't have to be fast or high resistance, but it does supposedly "massage" your joints without impact and help cartilage recovery. I definitely noticed a difference with myself and about two dozen clients with knee issues from running intensely (military, athletes, etc.).
As somebody who uses both, I personally think a rowing machine is better for general cardiovascular exercise than a bicycle. The work is better spread over your whole body instead of mostly the legs. You can get cheap ones with magnetic resistance that work fine for exercise purposes (the main advantage of more expensive rowing machines is more accurate simulation of rowing in a real boat).
Cycling is however a lot more interesting if you have somewhere good to ride.
If I had to load a boat onto my roof rack and drive to the nearest river at 5am every time I wanted to exercise, I'd do it once a week at the absolute maximum. I don't think it's a reasonable way to exercise for most people.
This is true. It's also why I moved to live near water. But, a lot of people in the rowing club do exactly this: row 1-2 times at most a week.
Rowing machines are fine. I'm not sure why they have a he-man scale going up to 11 when the on-water experience is mostly below 4, but I guess people need goals. Bad back goals.
Skipping rope is also a great option. Cardio is up there with running and it's not as hard on the knees. We usually start every session at my muay thai gym with it, and whenever I travel I'll just throw a cheap rope in my bag.
The question seems really poorly formed! There's never going to be a binary answer to a question like that. The answer is always going to be "it depends": on, for example, the volume, your physical attributes, recovery, genetics, age, etc.
I'll give you the principled answer and the cop-out answer. Here's the principled answer:
That was just the catchy title, similar to peer-reviewed literature reviews published on nih.gov: not necessarily the creme de la creme, just good enough to pass peer review. The real question is whether concern for cartilage erosion is well-founded, and whether or not it outweighs the scientific consensus that running improves bone density of the tibia and fibula. Again, the literature had strong evidence for the latter, while the former was still a major controversy in kinesiology.
I didn't even touch cardiovascular health, because to be fair we live in a world with bicycles and affordable YMCA lap pools.
Here's the cop-out answer:
It's a literature review; the very requirements are merely one step removed from those blog articles Harvard Medicine publishes for mass consumption. I followed instructions, one of which was to adhere to a maximum of 2 1/2 pages, and I got a Northwestern 95 on the assignment.
For people who like nuance and details, yes. But the point is, most people don't want that; they want to be told what to do, or to make a binary decision: good or bad.
FWIW I tell people that running is bad for your knees, but relative to other exercises! If someone wants to only run, then go do it... better than nothing.
There was a period of a few decades (I guess still ongoing, really) where parents sheltered their kids from everything. Playing in the dirt, peanuts, other allergens. It seems like all it's done is make people more vulnerable as adults. People assume babies are super fragile and delicate, and in many ways they are, but they also bounce back quickly.
Maybe part of it is a consequence of the risks of honey, which can actually spawn camp infants with botulism. But it seems that fear spread to everything.
Not to confuse things: There quite simply is a long list of things that can kill an infant and we get increasingly better evidence for what's on there and what is not. Avoiding death at all cost is ludicrous, but for a child born in the 1950s in high income countries the mortality rate was ~5%. 1 in 20 kids dead before the age of 5. For contrast, now it's closer to 1 in 300. That's not a coincidence but a lot of compounding things we understand better today.
Are there missteps? Certainly. Figuring out what is effective, what has bad secondary effects (fragility, allergies etc) and what is simply wrong is an ongoing effort and that's great, but less dying is a pretty nice baseline and progress on that front is inarguable.
To be a bit morbid, one could also explain OP's observation that "people are more fragile" via the lower child mortality, with the hypothesis that these more fragile people wouldn't have made it through infancy before.
I don't particularly believe this, but it fits Occam's razor, so it seems to deserve some examination.
Occam's razor is basically (paraphrased) "given two explanations where all else is equal, the one with the fewest added assumptions is most likely true." Based on that Occam's razor is already out the window because all else isn't equal.
Also, this "more fragile people" argument assumes the "fragility" is both inherent and lifelong. This ignores that most causes of infant mortality are external, and that for many of them, exposure results in a lifelong increased mortality risk. Excessive hygiene leading to more allergies is a direct example of this.
It was implicit, at least to my eye, that the other explanation, the one this was offered as a counterpoint to, was the grandfather comment.
For clarity, I will include both here:
The two explanations for increased adult fragility are:
forgotoldacc> Parents shelter their children too much and have created adults that have additional allergies as a result of lack of childhood exposure
rocqua> Increased sheltering of children has allowed more of the fragile ones to survive to adulthood, increasing the number of fragile adults we observe today.
What’s this increase in fragile adults you’re talking about? Are you sure it’s a real thing? Are you aware how staggeringly high rates of institutionalisation were in most western countries in the early to mid 20th century? And then there were the adults who were considered ‘sickly’. Like, _fainting_ wasn’t considered dramatically abnormal behaviour until quite recently.
A lot of people who today would be considered to have a condition which is entirely treatable by doing (a), taking (b), not doing/avoiding (c), etc, would, a century ago, have just been kind of deemed broken. Coeliac disease is a particularly obvious example; it was known that there was _something_ wrong with coeliacs, but they were generally just filed under the 'sickly' label, lived badly and died young.
(And it generally just gets worse the further you go back; in many parts of the world vitamin deficiency diseases were just _normal_ til the 20th century, say).
That makes a huge number of assumptions, and it also wouldn't fit their experience. If it were this, it would add a few percent of the population being "more fragile," and I'd wager they see it as a broader trend.
Intuitively, this does make a lot of sense, and it's easy to make an argument that if civilizational progress continues, in the far future people will in general have very weak bodies, simply because reliance on medical equipment won't be an evolutionary disadvantage.
I think most of the change in death rate is improved medicine (and maybe wealth too – plenty of people in the US in the 50s were very poor by modern standards) rather than parents knowing about many potential harms. (Maybe I’m wrong? Happy to be corrected here)
This is the conclusion I lean towards, but anecdotally one of my grandparents knew something like 3 or 4 kids who died before the age of 15, all in preventable accidents. Disease got at least a few more. It’s possibly just a coincidence but hearing the stories of how inattentive people could be to their children back then, I’ve always suspected current helicopter parenting norms must have accounted for at least some of the decline.
There’s been a similar shift with people letting their dogs roam free. When I was a kid I remember hearing stories about a dog getting run over by a car every year. I rarely hear these stories anymore because people usually keep their dogs supervised or in a fenced yard. I don’t have any hard data, but I suspect there’s something to these cultural shifts.
Vaccinations and better antibiotics reduced death rates a lot, but in 1950 accidents were still 30% of the death rate for children, killing 5 times as many children as die today from all causes.
The death rate for children aged 5-14 is 14.7 per 100,000, i.e. 0.0147%. That's basically zero, and five times that much is still basically zero. By comparison, the death rate for the 35-44 age group was 237.3 per 100,000.
Also, the most common type of accidental death is car accidents. So is even that difference from kids not getting to play outside anymore, or is it radial tires and crumple zones?
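To put rough numbers on "basically zero," here's a quick back-of-envelope sketch (Python; purely illustrative, using only the rates quoted above):

    # Rough scale comparison of the death rates quoted above (illustrative only)
    child_rate = 14.7 / 100_000      # annual all-cause death rate, ages 5-14, today
    adult_rate = 237.3 / 100_000     # annual all-cause death rate, ages 35-44
    accidents_1950 = 5 * child_rate  # 1950 child accidents alone were ~5x today's all-cause rate
    print(f"children today (all causes): {child_rate:.4%}")            # 0.0147%
    print(f"1950 accidents alone:        {accidents_1950:.4%}")        # 0.0735%
    print(f"adults 35-44 vs children:    {adult_rate/child_rate:.1f}x")  # ~16.1x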
> for a child born in the 1950s in high income countries the mortality rate was ~5%. 1 in 20 kids dead before the age of 5.
Essentially all of this was infant mortality, i.e. kids who died before the age of 1, and that in turn was more related to things like sanitation and vaccines and pre-natal screening.
But if you got into an accident, wouldn't antibiotics help with the injury, surgery etc.? A bad burn could get infected etc. And possibly similar for some birth injuries and birth defects, and bacterial pneumonia for sure.
Rationality and science might be pretty far apart. Flying a kite with a key attached in a thunderstorm, for example, isn't the most rational decision. Neither is scraping open your family's arms and applying cowpox pus.
Pretty irrational, but definitely celebrated... eventually.
If the best available means to perform an experiment carries some risk, it could still be entirely rational to do it rather than forfeit the knowledge gained from the experiment.
For example, take the famous mask debate. It could easily be settled by having volunteers stand in a room with COVID-positive people at various distances, each using randomized masks or no mask. There would be plenty of volunteers for such a study, but there's no way it would be approved.
The FDA doesn't count lives lost due to inaction and slow approval of new drugs and treatments. As Munger always said "show me the incentive and I'll show you the outcome." By any rational calculus, that one Thalidomide win by the FDA has caused incalculable death, pain and suffering by pushing out the timeline on not only recently discovered cures but all those built on top.
Imagine for example the number of lives saved if GLP-1 was purchasable over the counter in the 1990s when it was first discovered.
There seem to be some quite powerful forces acting in the opposite direction - social media maximising engagement by pushing divisive stuff and politicians trying to demonize the other team. Not quite sure what the answer is. I feel there should be some tech type solution. At least LLMs at the moment by taking in the whole internet seem fairly neutral although Musk seems to be trying to develop right wing versions.
One could argue that science being celebrated too much leads to this type of present-day outcome. Science can tell you how to do something, but not why, or even what we should do to begin with.
It’s not just "save as many lives as possible at all costs": saving 20 kids when 2 of them will develop debilitating peanut allergies isn’t worth it. Progress must be made slowly, ensuring no harm is done along the way.
What on earth are you saying? It's better to kill 20 children than to risk that 2 of them develop peanut allergies? I don't see how this can even begin to be an arguable position to take. And that's ignoring the fact that it isn't even a correct assertion in this case.
They’re not mutually exclusive options, we can save the 20 kids safely while having a mindset that values doing no harm.
Telling anxious parents to have their kids avoid peanuts caused harm that wouldn’t have happened otherwise. I guess it’s valuable to better understand allergies, but learning at others’ expense isn’t worth it.
I clearly misspoke and people are misunderstanding my point, which is only that "hurting people is worth it" is a horrible argument; we can and should save the 20 kids without causing harm to the 2.
Doing nothing is better than doing something if that something might hurt people in ways we don't understand.
What specifically do you disagree with? I’ve explained it three different times now and can’t delete my original comment so please let me know
This research shows physicians harmed kids recommending they avoid allergens like peanuts, is that something we should ignore because all the benefits of science are “worth it”?
Science is amazing not because it’s always right, but because it (should) strive to always do better next time
All you're fucking doing is saying "don't save a million people if 1 person is going to be harmed," OR making the utterly trite point of "wouldn't it be great if everything was magical and no one was ever harmed by anything."
What you’re describing is called utilitarian ethics; the exact tradeoff is called the trolley problem. Ethics is much more complicated than a single comment thread.
“it’s worth it” is a horrible argument when people’s health is on the line.
"What doesn't kill you makes you stronger" makes for a fun little statement. It's not actual natural law though, right? I feel like it's fairly well documented that good hygiene is a win for humanity as a whole, so I have some skepticism for generally saying "well let the kids eat dirt". We did that for centuries already!
The thing I'm a bit curious about is how the research on peanut allergies leading to the sort of, uhhh, cynic's common-sense take ("expose 'em early and they'll be fine") only arrived in 2015. Various allergies are a decently big thing in many parts of the world, and it feels almost anticlimactic that the dumb-guy take simply applied here and we didn't get to it sooner.
Maybe someone has some more details about the basis for the original guidelines.
Speaking as someone who has had a lot of experience talking with doctors in poorly-understood clinical situations over the years, most doctors display a need to establish informational authority over their patients.
So if the "dumb guy" take is "just expose the kids, they'll adapt to it", in the absence of hard evidence to the contrary (and maybe even with it) the average doctor is going to _reflexively_ take the opposite position, because that shows that you (or the conventional wisdom) were wrong.
There are exceptions, and they are either the ones that just don't care at all, or they're the best docs you'll ever find.
A justification I read once is that the human immune system evolved to deal with a certain amount of pathogens. If you don’t have enough exposure to pathogens, the immune system still tries to do its job, but winds up attacking non-pathogens.
> Various allergies are a decently big thing in many parts of the world
Maybe we live in bubbles.
I am from Asia. The only time I have seen people taken to an emergency hospital for an allergy is in American TV shows. Here I've never seen it in my whole life, and I didn't even know an allergy could be this dangerous. We don't have peanut allergies here either. The first time I saw it on TV, I was very confused.
Allergies do exist here, but "not to the extent" of what I've seen in American TV shows or heard online.
The only thing I remember is people needing to take medicine for allergic reactions to venomous caterpillar hairs they touched by mistake, or to stings from honey bees, wasps, etc.
I think this is selection bias. I know plenty of people in Asia who have plenty of allergies to some degree or another (selection bias on my side as well).
Hell, most of hayfever season in Tokyo is a bunch of people with allergies!
I think you should remember that American TV shows will use certain kinds of extreme scenarios to make a story. Lots of people are allergic to things in a fairly benign way.
And also, just more generally, I think Americans are more likely to identify that _they have a shrimp allergy_ when every time they eat shrimp they feel bad. But I know plenty of adults who just go through life thinking "I guess I feel sick every time I eat this" and are not willing to use the word "allergy".
Or maybe the prevalence of peanut allergies is really low.
A quick Google search says Asian populations have more allergies to buckwheat, royal jelly, and edible bird's nests from swiftlets. Shellfish is still one of the most common allergies anywhere.
Same in a decent chunk of Europe too. Allergies exist, but they're rare, and more of the type where you're not quite sure you believe the person telling you they're allergic, because it hadn't even occurred to you there could be an allergy to that. Like tomatoes, peppers, or raw carrots.
The UK seems to be a bit of an exception. And it shows: the only two countries where waiters have asked me about allergies as standard are the US and the UK.
If it makes you feel better, I'm nearly 50 and I have never in my life heard of people needing to take allergy medication for mistakenly touching caterpillar hairs.
That one incident was serious: the person slept on a caterpillar and got stung all over their body. Here all caterpillars have venom in their hairs. Personally I've touched them many times by mistake but didn't have to take medicine; the itchiness and swelling go away within an hour.
I think a lot of the delay is that it took a while for people to realise there was a problem. The perhaps-excessive hygiene thing didn't really get going till the 1960s, so you didn't really see the rise in allergies till a couple of decades after; then maybe scientists started figuring it out in the '90s, and then it takes a while to get proven well enough to recommend to parents.
What doesn't kill you postpones the inevitable. Sometimes it makes you stronger, often it makes you weaker. E.g. if your arms get amputated you're extremely unlikely to break your bench press personal best afterwards.
Sorry I might have expressed myself badly, I get it works sometimes but it's not a hard law for "everything", even if... maybe it's a good default? maybe?
Not true generally. For example, catching measles can wipe out your immune system, thereby making you more likely to get sick. Other pathogens can also work this way.
"Mithridatism is not effective against all types of poison. Immunity is generally only possible with biologically complex types which the immune system can respond to. Depending on the toxin, the practice can lead to the lethal accumulation of a poison in the body. Results depend on how each poison is processed by the body."
"A minor exception is cyanide, which can be metabolized by the liver. The enzyme rhodanese converts the cyanide into the much less toxic thiocyanate.[12] This process allows humans to ingest small amounts of cyanide in food like apple seeds and survive small amounts of cyanide gas from fires and cigarettes. However, one cannot effectively condition the liver against cyanide, unlike alcohol. Relatively larger amounts of cyanide are still highly lethal because, while the body can produce more rhodanese, the process also requires large amounts of sulfur-containing substrates."
Our immune, metabolic, and other systems are built to be adaptable, and some things are easy to adapt to, but other things are difficult or impossible for them to adapt to.
While that deals with deliberate poisoning, when it comes to environmental contaminants, such as lead and other heavy metals, PM10s from vehicle exhausts, or the other by-products of coal power stations and wood fires, I suspect that long-term exposure is not something where "you can build a tolerance" is a useful framing at all. Even if you technically do, it's irrelevant to the harm caused over time to whole populations.
The post that I am responding to does in fact deal in absolutes by asserting that "What doesn't kill you makes you stronger" is a natural law. Please don't troll by attributing that to me.
There are tons of counterexamples. Chronic traumatic encephalopathy. Limb amputation. Displaced bone fractures that are never set. Crohn's Disease. Being in a coma for six months and losing all of your muscle mass. Third degree burns over 90% of your body. Plenty of things that don't kill you also don't make you stronger.
The funny thing about trying to apply this logic in reality is that it often breaks down in ways that can be really, really bad.
I've brought up this example many times before, but measles is a great example. Measles resets your immune system and breaks immunological memory for anywhere up to three years after recovery. But now we have a bunch of people who assume any disease can simply be dealt with in a natural way by the immune system, thanks to the logic above, and well, the consequences of that are becoming clear.
> "What doesn't kill you makes you stronger" makes for a fun little statement.
It comes from a philosopher, talking about something completely unrelated to health care; and, ironically, it was part of a strong criticism calling people who say things like that stupid, written by one of the figures most vilified in history for being misunderstood when he called things stupid.
I always think about how animals eat - basically their food is never clean and always mixed with dirt. Evolution dealt with this problem since forever.
Plenty of stuff is poison to animals as well as humans! Lots of animals get sick and pick up parasites from the things they eat.
Like with humans, though, animals have immune systems, which help. This is the trouble with food-hygiene arguments: you can eat "dangerous" food and 99% of the time be fine. But it's still good for people not to roll the dice on this stuff, even with a 1% hit rate. We eat food 3 times a day, so that's potentially 0.01 × 3 × 365 ≈ 11 very adverse events per year!
"Yeah, I get food poisoning once every month or two" is a thing that some people live through, and I do not believe they enjoy it. I have not had food poisoning for a very long time, and I appreciate that a lot.
And one of the ways evolution dealt with this problem is by evolving intelligence that can then tell you to improve hygiene practices to reduce the "natural" death rate.
My dog will eat literal street crap at the first opportunity. She’ll also just throw it up on the carpet 2 hours later if she’s not feeling it. Not sure that’s really an improvement.
I have a great example of this. For our first kid, we had created a Sterile Field in our kitchen for pacifiers, baby bottles, etc. The sanctity of the Sterile Field was never violated. We would wash things by hand and then BOIL them and place them into the Sterile Field. This kid is allergic to tree nuts and a few other things.
For baby number 2, soap and water is enough. There's no time for Sterile Field nonsense. This kid isn't allergic to anything.
There was a local mom who had 4 thriving kids. When their baby dropped the pacifier in the dirt, it just got brushed off and handed back to the baby. I don't think those kids had any allergies.
I've not seen a lot of research about how allergies develop as you get older.
For me, as a kid: very, very allergic to cats, kinda allergic to many food items, and a little allergic to horse hair (only noticeable when they were shedding in the spring).
As a young adult: Only 2-3 food allergies remain, cats still strong, hayfever starts.
Then I took shots against the hayfever for 2-3 years; now the cat thing has mostly improved and the hayfever is basically gone. So only 2-3 food allergies remain.
As an adult I developed an allergic contact dermatitis reaction to some sulfates (sodium lauryl sulfate definitely, sodium laureth sulfate definitely, and something in raw onion juice) after a bad burn on one of my fingers. Probably due to exposure while it was healing, since a lot of soaps like Dawn have one or both of the two. Self-testing to find a soap that didn't blister my hand, and then to narrow down which ingredients caused the reaction, was a long and unpleasant process. So it's definitely possible to develop new allergies as an adult, as well as to lose existing ones.
The thing is, the sterile field is actually very important... for the first 3 or so months though. The immune system isn't developed enough yet and many medicines cause more harm at such a young age.
However, this doesn't need to continue for very long before basic cleanliness is enough and medicine can be used effectively without harm.
> It seems like all it's done is make people more vulnerable as adults.
In 2000, the American Academy of Pediatrics recommended not allowing your kids peanuts until they were 3 years old. It was just parents following doctor's (bad) advice.
To avoid confusion: whole peanuts can't be eaten directly because of choking risk, as infants cannot chew them. The advice is to add peanut as an ingredient, e.g. as peanut butter.
They were advised against because of the allergy risk, not because of choking hazard. Are you a parent? No shit you don't give hard nuts to a baby with no teeth.
A timely reminder that although doctors aspire to follow science, and many doctors are scientists, and most doctors advocate evidence-based medicine, the practice of medicine is not a wholly scientific field, and particularly the big associations like the AAP are vulnerable to groupthink just like any big org.
Also, science is persistently incomplete, and actually making decisions (or giving advice) requires making assumptions (often neutral ones, but ones that can turn out to be quite wrong) about what sits in the unfilled gaps. The advice to avoid peanuts came about because it was clear that severe peanut allergies existed, it was clear that they affected a small fraction of children, and it was clear that when they affected very young children, those children weren't able to let people know what was going on as well as older children and adults could, to enable timely intervention.
There wasn't much information one way or the other on what avoidance did as far impacting development of allergies, and with the evidence available, delaying exposure seemed prudent.
I’d argue that the fear you speak of spread because it was profitable. I hit the '90s in my mid-teens, and boy howdy did it seem like every news outlet, especially the local ones, had their sights set on making us terrified to eat or drink things we'd previously consumed without much thought. Fear gets viewers, which is how revenue is generated, so there’s an arguable conflict of interest there.
The real problem is that some of those claims and reports were true, but we were so inundated with the rhetoric that everything was going to kill us that many of us sort of lapsed into apathy about it. Stepping back, the food industry in the US clearly does not have consumer health at heart, and we struggle to find healthy options that avoid heavy processing or synthetic fillers. Those parents who sheltered their babies back then may have been on to something when it came to the stuff we consume, and we should have been on the path to demanding better from our food sources had more of us been diligent with our grocery choices (myself included, at the time); but instead we ended up with bread that lasts unnaturally long and has an allowable amount of sawdust as an ingredient.
I'll wager that more children and adults have been killed by assault rifles and oversized vehicles over the past few decades than have died from a peanut allergy.
That kind of assumes they are sheltering kids, but to be honest peanuts aren't really that common a food, certainly not in foods you would commonly give a four-month-old child.
In America and much of Asia, peanuts are incredibly common. This is like an Indian person saying beef isn't a common food. In your country, sure. The rest of the world? No.
Infants in SE Asia are probably getting near daily exposure to peanuts.
[According to Google] My country has a per capita peanut consumption of 1.4 kg per person per year vs America's 2 kg. So not that different.
I still maintain it's mostly in foods people don't generally give to toddlers. People may give a PB&J to a 5-year-old, but they don't generally feed that to a 6-month-old. Not because they are protecting them from peanuts, but because people generally don't give sandwiches to toddlers.
A big reason that the effect of avoidance was hypothesized, then studied and nailed down, is that (even when avoidance became common in the US) peanut-containing snacks were (presumably still are, it wasn't that long ago) a very common food for very young kids in Israel.
Yes, there are some counterexamples. Bamba (peanut-butter-flavored puffed maize) in Israel is one, worth studying as it is commonly given to very young kids.
But generally speaking, the USA is an outlier in the prevalence of peanut butter specifically, and to a lesser extent peanuts in general.
Exposure to microbes and potential allergens relevant to the hygiene hypothesis doesn't seem likely to have changed very much - it's not like people started keeping their babies in sterile bubbles. While lots of wishful thinkers jumped on the concept in recent years, the hygiene hypothesis doesn't apply to disease-causing pathogens like COVID or the flu. But yes, will be something to pay attention to, considering the massive volume of COVID infections and COVID's negative effects on the immune system.
I grew up in a smoking house. We didn't have any house cleaners. We wore our shoes in the house. I spent my childhood outdoors playing in the dirt. When we were thirsty we drank garden hose water or went inside for some Kool-Aid.
> where parents sheltered their kids from everything. It seems like all it's done is make people more vulnerable as adults.
I don't agree that this is "all" that it has done.
There are many cases where reducing exposure as much as possible is the correct thing to do. Lead is the best-known example.
As the other reply pointed out, the second-order effect, the nuance that comes later is that sometimes this isn't the right thing to do.
But it would be basically incorrect to reduce it to blanket, binary, "all good" vs "all bad" black-or-white conclusions just because there is a smaller course correction when it turns out to be not entirely good. Concluding that "all it's done is cause problems" is a knee-jerk reaction.
Most likely you know already, and if that's the case just ignore this comment. "Spawn camp" in this context is gaming terminology: it refers to an enemy who camps/waits for a long time and kills you as soon as you are placed in the battlefield, which is your spawn point, hence spawn camping.
> There was a period of a few decades (I guess still ongoing, really) where parents sheltered their kids from everything.
The hygiene hypothesis is not impossible, but the evidence for and against it is questionable. But anyway, for peanuts it's not about hygiene.
It's a much more complex mechanism that retrains your immune system from using the non-specific rapid-response allergic reaction to the T-cell-mediated response.
The same method can be used to desensitize yourself to poison oak or ivy. You need to add small amounts of them into your food, and eventually you stop having any reaction to urushiol contact with the skin.
"[eg] women aged 30–34, around 1 in 70,000 died from Covid over peak 9 weeks of epidemic. Over 80% pre-existing medical conditions, so if healthy less than 1 in 350,000, 1/4 of normal accidental risk"
The biggest reason I took covid19 seriously was that many countries in separate parts of the world took drastic measures, unlike nut allergy, which is the poster child for first-world problems.
I don't understand why a quotation - a straightforward summary of factual information about the virus and its low risk to a specific group, written by a professional statistician and University of Cambridge professor - is still considered contentious or triggering to some people, even five years later.
Aside from the skin lotion thing[1] that got popular recently, what is the state of the art in 2025 for allergy prevention? It feels like there is a lot of common ignorance in this space, but the literature is full of better practices.
A relative has tried acupuncture therapy for their kid and says it works wonders! I had never heard of it, and reading up on it you would have sworn it was crank magnetism; but they swear up and down about it for their kid, and I've personally witnessed the kid being introduced to food items they were previously severely allergic to, with only minor and easy-to-mitigate issues.
This world makes little sense, but I guess I'm here for it!
>A decade after a landmark study proved that feeding peanut products to young babies could prevent development of life-threatening allergies, new research finds the change has made a big difference in the real world.
I am sorry, but am I going crazy?
We have been giving infants small amounts of peanut butter, egg etc... for decades where I live. But also let them play outside, get dirty put stuff in their mouths to train the immune system.
Sometimes things are common knowledge, but don't necessarily have longitudinal studies to back them up. I think a significant number of people have thought it to be common knowledge, but now there are large studies backing this up as well.
Here is another study, from as early as 2008, that shows similar results:
Objective: We sought to determine the prevalence of PA (peanut allergy) among Israeli and UK Jewish children and evaluate the relationship of PA to infant and maternal peanut consumption.
Nutritional science has unfortunately been pretty bad at the science part for a rather long time.
There's a dark pattern hiding in the modern era where we assume hard evidence exists where it doesn't: a projection of CAD engineering onto idle theorycrafting and opinion.
For any parents wary of trying to think up a way to implement this yourselves: don't. Someone already neatly packaged it up and removed the thinking from the process. (protip: feed it to your baby in a hospital parking lot)
I wonder why the old advice was being given if it was so wrong? If nobody understood what to do, shouldn't there have been no advice instead of something harmful?
You seem to be suggesting that doctors should not suggest any health precautions until controlled experiments have found them effective. That is the position taken by the highly-cited paper "Parachute use to prevent death and major trauma related to gravitational challenge: systematic review of randomised controlled trials", which you must read immediately, because in a peculiar way it is a paper about you: https://pmc.ncbi.nlm.nih.gov/articles/PMC300808/
You don't need a controlled experiment if you have a good enough understanding of the mechanism, such as with parachutes. But since they apparently had no idea how peanut allergies worked nor had any adequate studies, they should have just shrugged their shoulders when asked for advice.
Even with parachutes, you could do a study (not a RCT) by looking at historical cases of people falling with and without parachutes. The effect would be so strong that you wouldn't need those clever statistical tricks to tease it out.
Lots of things that merely harm children can kill infants, so keeping infants away from things that harm some children probably seemed correct. The mechanism for allergy development wasn't well known, and it seemed reasonable to avoid the allergen in case the allergy was genetic or something and would cause a hard-to-treat reaction in the infant.
If people are developing allergies to food, isn’t a logical first step to not expose babies to the allergens? It seems logical. It turns out to be exactly backwards.
It would seem logical, until you learn what allergies are. They are the body's immune system overreacting to something that would normally be harmless, and acting as if it's an invading pathogen. Once you learn that, then realizing "hey, expose the body to this thing early on, and the body's immune system will treat it as normal" is a logical step.
If this theory (that early exposure teaches the immune system not to overreact) is right, then another logical consequence would be that kids who play outside in their early years would have fewer pollen allergies than kids who mostly play indoors and are exposed to far less pollen than the outdoors-playing kids. I don't know where to look for studies to prove or disprove that thesis; anyone have any pointers?
Well, I mean, did you know that skin exposure can sensitize and oral exposure builds tolerance? I certainly didn’t. That’s a subtlety of the exposure game that I did not know.
E.g. from age 27 weeks my daughter has played in a little herb garden full of mud and grass I built for her. She grabs and eats leaves from the herb plants (the basil is entirely denuded, so that's a complete loss). At first I just wanted her to play in the garden out of the same naïve exposure-equals-tolerance model. I never would have considered that skin exposure is different from oral exposure. As it happened she ate the plant leaves, and it doesn't matter either way, since this part of immunity (to microbes here) doesn't work the same way as peanuts anyway.
There is a joke that the book "Immune System 101" is 1000 pages long. The immune system is one of the most complicated systems in biology; simple logical arguments like yours above rarely apply, and everything needs to be tested to be sure.
Bad advice that has a very long return on investment is quite sticky.
For instance the "cry it out method" did massive amounts of psychological damage to more than one generation, but it seemed to work in the short term as the babies eventually learned to "self-soothe".
Even now I still see parents and grandparents suggesting it in parenting groups; and taking extreme umbrage at the idea that it might have damaged them/their children.
And the variations on "a little spanking", "spare the rod", "dad would take us out behind the woodshed"...
Careful studies have shown that violence used on children percolates back out of them in rather rapid fashion: a great majority go on to use violence to interact with others within the next two weeks.
So, yes, as it turns out: a little spanking did hurt... specifically, it hurt innocent bystander kids.
It's too big of a topic for a HN comment, but do a google or LLM search and see. One widely-accepted aspect is that a child cannot "self-soothe" until age 5-7. It's not developmentally possible, and using that language is a bit of a PR move to gloss over what is actually happening.
People did understand what to do, it just turns out their understanding was wrong. We might still be wrong though, one study isn't definitive proof of anything. We have to make decisions with the knowledge we have at the time, and it's normal for those decisions to look dumb in hindsight.
The 2000 guidance was based on expert opinion because there were no studies. LEAP was published in 2015, and it gave the first level-1 data on peanuts.
Anaphylactic shock is scary and peanut fear was a big deal in the late 1990s but actual risk of harm was very low. The guidance was more about the psychosocial burden placed on parents when there was no guidance. Anxious parents have been studied, that mechanism is reasonably well understood and that harm can be quantified.
Hindsight is 20/20. The fact is that thousands of children were dying and public health officials were set to task to identify interventions that help.
They know that skin and mucosa sensitization can occur in response to allergens.
A reasonable hypothesis is that there’s some boot-up process with the immune system that needs to occur before anything happens. The kids are dying today. “Avoid the thing that can cause sensitization” is a conservative position.
It is unusual that it turned out to be the opposite, and that oral exposure induces tolerance. It's the fog of war.
The standard conservative intervention has helped in the past: I'm pretty sure seatbelts didn't have strong mortality data before they were mandated. If it had turned out that more people were killed by seatbelts trapping them in vehicles, it would make for a similar story. I think they also excluded all blood donations from men who have sex with men during the initial stages of the HIV pandemic (no evidence at the time).
Edit for response to comment below since rate-limited:
Wait, I thought it was on the order of ~150 people/year dying from food anaphylaxis, though I didn't research that strongly; it was off the top of my head. If you're right, the conservative advice was definitely far too much of an intervention and I agree entirely.
You should really educate yourself on lactose intolerance, or really reconsider how you view medical conditions in general. Being a "bitch" or not has nothing to do with whether your body produces certain enzymes.
> Just one or two nights of pain
you shouldn't be allowed near children if that's your approach
Undoing of the effects of excessive and unnecessary social guidance takes ages.
At some point in the history of civilizations, humans started having less work to do and more idle people around. The idle people started spending their time preaching a lifestyle other than what had evolved naturally through centuries and millennia. They redefined the meaning of health, food, comfort, and happiness. The silliest thing they did was create norms, redefining good and bad based on their perception of comfort and happiness, and enforcing those norms on populations.
The human race has continued to live under the clutches of perceptions from these free-thinking idle people, whose minds worked detached from their bodies and thus lacked the knowledge gained from millennia of human evolution.
I think people seek out these restrictions on their own. Almost everyone I know has some sort of belief about what's healthy and what isn't.
Some people become vegetarians, some people become vegans, some people believe eating big steaks of red meat is healthy, some people avoid pork, others do not eat cooked food on some days of the week, others eat only fish on special holidays, some people tell you that yoghurt is good for your gut, others tell you to avoid dairy at all cost, some tell you to avoid carbohydrates, ....
Some of these are backed by science, some are batshit crazy, some are based on individual preferences.
I don't think this is a new phenomenon. People just love coming up with rules, and even if our society allows you to eat pretty much whatever you want, people still seek out restrictions for themselves (and their kids...)
You have left out the elephant in the room - the government controlling the food choice, healthcare, medicines, and overall a lifestyle. You don't have as much freedom as you would like to think.
What are you talking about? Food safety regulations like requiring milk to be pasteurized?
I think that's just common sense, but at least in my home of Austria you can still easily get un-pasteurized milk if you really want. I'm not sure how the "government" controls my food choices? (In some cases I would actually prefer more regulation, because some producers make some questionable choices. I would prefer to buy cured meat without nitrates, but it's quite hard to find)
Have you ever heard of obesity and the variety of diseases that are mostly specific to some countries and their lifestyle? If not, you should travel to some third-world countries. This is only to show you that your government is the biggest stakeholder in and controller of your lifestyle, not you.
Yes, when the mind is over-confident in its education and perceptions, it starts to disobey the signals from the body and forces the body to follow what the mind says. That's when the mind loses the support of the knowledge encoded in the body, the knowledge that was collected through evolution.
The mind tries to compensate for the loss with experimentation that can't undergo the same extent of evolution. Then it dictates that the body follow the results of these puny little experiments, and ignores the rich knowledge already encoded in the body.
>when the mind is over-confident in its education and perceptions, it starts to disobey the signals from the body and forces the body to follow what the mind says.
Isn't that one of the fundamental things being taught to nascent minds as a prerequisite to participating in society -- starting the earliest stages of development, at which point neither one's mind nor one's body really has much of a say in the matter?
A little bit off topic, but even after years of active interest, I'm still amazed by the complexity of the human immune system.
Imagine this: we are all born with a functional immune system which is pre-programmed with knowledge of what bacteria, viruses, and many parasites look like, so it can immediately deal with these without prior exposure. This is the innate immune system, and in many organisms is the only immune system.
On top of that, a database is created which consists of fragments of all our bodies' molecules. This database is used to train the adaptive immune system. The thymus will then present these molecules to new white (T) cells, and screen out the ones that recognize these "self" molecules. This is the adaptive immune system.
Still on top of that, there's another tier, because maybe 0.1% of T cells escape the first-pass screening. You now have a series of checks and balances which screen for these escaped cells outside the thymus, and either reduce their functioning or eliminate them entirely. This is peripheral tolerance (what the Nobel prize in medicine was awarded for this year).
And when there's an actual infection, this system is able to spin up a few VMs, run a large bug-search model, and create a pool of tailor-made antibodies and T cells specific to the new bug, which in most cases are enough to deal with the infection.
So when all is said and done, and the system is trained and working as expected, you now have an immune platform which, on top of recognizing all its own molecules, can also recognize pathogens, including differentiating disease-causing ones from the benign ones; can also deal calmly with the enormous diversity of things we put in our mouths, noses, and other orifices; and in most cases doesn't actually go rogue.
But sometimes, it can be overcome by peanuts.
A lot of science supports the idea that helicopter hyper-parenting is hurting kids by having them grow up in an environment that’s too sterile. Let your kids go outside and roll around in the mud a bit. God forbid they lick the floor. Science says it’s good for them!
The Hygiene Hypothesis, which is what you have described, fell out of favor in research circles during the late 2000s. What seems to be causing the allergy and autoimmunity epidemic is a loss of keystone species in the gut that have co-evolved with humans and provided essential services to the immune system, as well as stability to the entire ecosystem. See e.g. a famous review on this issue [1]. Other important factors might include novel protein antigens and small molecules we have not co-evolved with.
[1] Darwinian medicine and the ‘hygiene’ or ‘old friends’ hypothesis. https://pmc.ncbi.nlm.nih.gov/articles/PMC2841838
1. Early-life exposure to diverse, nonpathogenic microbes is linked to lower risk of allergies and asthma: "farm effect" (Amish vs. Hutterite study showing microbe-rich house dust associates with protection). 2. What matters is exposure to the right environmental and commensal microbes, not skipping handwashing or basic hygiene. Microbial diversity is good; pathogen exposure is not.
I don't think you need to be a doctor to come to the conclusion that a system exposed to more learning data is more knowledgeable.
This issue to me seems to be entirely a study of psychology not biology.
Can't remember a single kid with a peanut allergy growing up in the 70s.
I had friends with nut allergies in the 80s.
Before the epipen, I imagine the mortality rate would be pretty high, and it didn't arrive on the market until 1983.
Prior to the EpiPen, people carried the Ana-Kit. It became commercially available in 1963 and was a little kit containing a syringe pre-loaded with epinephrine, antihistamine tablet(s?), and a tourniquet.
People in anaphylactic shock sometimes (often?) need more than one dose, and antihistamine should be taken asap. The epinephrine just bridges the gap until the antihistamine kicks in.
I liked the Ana-kit because the syringe had 2 doses in it (you turned the plunger 90° for the second dose) and the antihistamine. It was much cheaper, and it was pretty easy—- just pull off the needle cap, stick your thigh to the hilt, and press the plunger.
Despite the relative ease of autoinjectors like EpiPen, I was pretty upset when Ana-kit was discontinued and I had to start carrying EpiPens. That’s why I always get the generic 2-pack prescribed and keep it in a ziplock bag with a couple Benadryls.
They probably weren’t allowed around you if you were eating peanut butter. Did you eat a lot of peanut butter growing up?
I grew up in the communist bloc, finely oppressed by the Russians with their military bases all around, ready for a nuclear WWIII that never came.
Peanut butter isn't something I ever saw before being an adult, well into the 90s; it simply wasn't a thing. I guess it was an evil capitalist invention with the only goal of subverting our fine communist paradise, like, say, Star Wars movies. Raw peanuts were frequent though; I guess they were one of the very few things that actually made it through a very badly functioning central planning system and weren't stolen by apparatchiks and collaborators for themselves. I never met a single kid with any sort of peanut allergy while growing up; I never knew it was a thing. I recall one or two with asthma or hay fever, and that's it. But the same could be said about any form of mental disease/issue for whatever reason (anxiety, ADHD variants, and so on): either ignored, undiagnosed, or really at much lower levels, I don't know.
Kid misbehaving? A fine smack or some other physical punishment settled things, at least in primary school. Then things started to change a bit, until they overcorrected to where we are these days.
This is such a strange modern take. Parents didn't "allow" their children around others. Unless they were royalty, most wouldn't spend more than 40 minutes a day with their parents. We don't see that concept coming to America until the late 80s/early 90s, in affluent neighborhoods, as a way to socially distinguish themselves from "ghetto" families.
To what do you attribute the apparent increase?
Almost literally no one in Indian schools has a peanut allergy.
Lumped in with the SIDS batch.
Allergy in children in hand versus machine dishwashing
https://pubmed.ncbi.nlm.nih.gov/25713281/
tl;dr: In families who hand-wash their dishes, allergic diseases in children are less common than in children from families who use machine dishwashing. We speculate that a less-efficient dishwashing method may induce tolerance via increased microbial exposure.
Correlation/causation?
I think I’ve heard three different comedians make a joke out of this saying if we just gave all the kids peanut butter the problem would eventually take care of itself.
But it’s not actually a joke.
I guess that’s the weird thing about jokes. They often get very close to something that is true but they come at it sideways, so it’s funny instead.
One of the difficult parts of this advice for me was that my daughter wasn't eating food at the time when we were supposed to introduce it. In those cases, you're supposed to add peanut butter to the milk, which we did a few times. We let it slip for a few weeks, because it was one more thing in a pile of many things. We got her back eating peanut butter once she started eating food, but it was too late. She had developed a peanut allergy.
After going through the desensitization program at an allergist, we're on a maintenance routine of two peanuts a day. It's like pulling teeth to get her to eat them. She hates peanut M&Ms, hates salted peanuts, hates honey roasted peanuts, hates plain peanuts, hates chocolate covered peanuts, hates peanut butter cookies, and will only eat six Bamba sticks if we spend 30 minutes making a game out of it.
I highly recommend being very rigorous about giving them the peanut exposure every single day. It would have saved us a lot of time.
My daughter was in the original study, back in 2012. It was very interesting, but a lot of work. She was randomised into the early introduction arm, so she had a whole load of allergens that she had to eat regularly. iirc, as well as peanuts it was egg, sesame, white fish, milk and wheat. There were several trips to London for lots of tests.
By the time my second daughter was born in 2014 they'd told us some of the preliminary results, so we followed the guidance even though she wasn't in the trial – no peanuts in the house so she couldn't get any on her skin until she was able to eat solids, and then peanut butter was her first solid food and we fed it to her throughout her infancy. I don't know if that's the cause, but she still eats it by the bucketload.
How long did you delay for? It's not like there's some tiny window of opportunity before 10 months or whatever. Consider that the Spanish conquistadors, who literally never saw a peanut as children and tried their first peanuts as adults, all survived long enough to make peanuts a globally accepted food. You can't blame yourself. To think that somehow not delivering peanut exposure was a sure cause of the allergy is nonsense.
Prior to modern hygiene, most humans probably had worms, as well as having to constantly battle other pathogens. Immune systems had no time for peanuts in the face of these other threats.
I don't remember exactly, but I suspect that the introduction and then disappearance was worse than not introducing it at all until we could do it consistently. It was probably something like six weeks between giving up on peanut butter in her milk and starting her on solid food.
> I suspect that the introduction and then disappearance was worse than not introducing it at all
I'm not convinced that we understand the human immune system quite that well.
Speaking as someone with three kids and (sadly) quite the handful of apparently inherited medical conditions in the family.
As it stands we have:
* coeliac (me, plus two of the three kids... and the third kid already tested positive on the coeliac genetic test)
* childhood asthma (me, plus one of the three kids)
* severe allergies (me, plus two of the three kids)
No nut allergies, so far. We're still counting :/
I'm not aware of a recommendation to give peanuts/other possible allergens that regularly; at least I'm certain that's not a thing where I live. The change was that peanuts were previously avoided completely for years, and now they are added when it fits, like peanut butter on toast once in a while. Outside of the desensitization therapy you go through now, you do not give like two peanuts every day or put it in milk regularly. You just test for an allergic reaction early and then stop thinking about it; that's the change.
So you did nothing wrong. The six week pause was completely meaningless.
Sounds more likely that she was just bound to get the allergy anyway. Giving children exposure to allergens early decreases the risk; it does not eliminate it 100%. I doubt that not feeding her peanuts every day was what made the difference.
Yeah if the kid's immune system was so sensitive to peanuts they might have gone from 95% chance to 90% chance with exposure or something like that. I expect the population level risk would be heavily skewed by those with low sensitivity (who might benefit a lot from exposure) and those who wouldn't have developed an allergy either way.
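A toy simulation of that skew, with entirely invented numbers just to illustrate the shape of the argument:

```python
import random

random.seed(0)

# Hypothetical split: 5% "high-sensitivity" kids whom exposure barely helps,
# 95% "low-sensitivity" kids whom it helps a lot. All rates are invented.
def allergy_rate(exposed: bool, n: int = 500_000) -> float:
    allergic = 0
    for _ in range(n):
        if random.random() < 0.05:              # high-sensitivity kid
            p = 0.90 if exposed else 0.95
        else:                                   # low-sensitivity kid
            p = 0.005 if exposed else 0.03
        allergic += random.random() < p
    return allergic / n

print(f"no early exposure: {allergy_rate(False):.2%}")  # ~7.6%
print(f"early exposure:    {allergy_rate(True):.2%}")   # ~5.0%
```

Under these made-up numbers the population-level rate drops by about a third even though the high-sensitivity subgroup barely moves.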
You don't need to let them 'eat' it. Just introducing it to her body by putting a tiny amount in her mouth is enough to engage the immune system.
There is immune tissue in the gut as well (see Peyer's patches et al.), so oral exposure only may not actually be enough.
>>I highly recommend being very rigorous about giving them the peanut exposure every single day
I honestly can't tell if this entire post is some kind of parody or what. That cannot be real - I don't know anyone, nor have I ever heard of anyone, basically force-feeding their child peanuts to maybe avoid a peanut allergy later in life. It sounds insane, just like the presumption that because you missed some imaginary time window in their development, your daughter developed a peanut allergy. That cannot possibly be real.
Which part sounds insane?
Living with a deadly allergy for the rest of one's life is no fun at all. A large part of social life is eating together in various locations where the allergen may not be so strictly controlled. Either one faces an easy death weekly or one opts out of many social activities. It is awful, and not wanting that for one's child seems natural to me, not insane.
I deliberately expose my child to a lot of things I want them to have in their life: climbing, swimming, the game of go, Unix command lines, Newton's laws, musical instruments, etc. Doesn't seem odd to add peanuts to the list. (Well, for us it is logistically inconvenient because another member of the household has a deadly allergy, but if it weren't for that it would be sensible.) Not insane either.
Is it the idea of exposure leading to lower prevalence that sounds insane? That's been relatively strongly established in randomised trials. Not insane.
Some parents seem to perpetually live on the verge of an existential crisis for fear that they might do or not do something that will forever scar or harm their child.
Sleep deprivation and unending anxiety do weird things to people. Some people seem to genuinely go a bit crazy once they have kids.
I think you are reading the parent comment wrong. They are highly recommending it because their child DID get a peanut allergy not because they MAY develop one later.
Nobody is reading the parent comment that way. You might be reading it wrong, but in a different way, one that misses the "specific window" fantasy.
No I did read it that way. I understand perfectly that their child developed a peanut allergy, and I'm very sympathetic - but the assumption that if only they fed her peanut brittle within some specific time period would have avoided it is just pure fantasy, or wishful thinking at best. They are of course free to assume so and I am well familiar with the feeling of "if only I did things differently" that every parent gets.
It's not a silver bullet, but there are many studies (including the ones referenced in the article) that show that introducing peanuts in a consistent early childhood window reduces the likelihood of later developing a peanut allergy. I don't think this is "pure fantasy."
Different, but sort of related...
Our daughter recently developed EILO (exercise-induced laryngeal obstruction). It sounds silly and totally illogical, but more than once I've found myself wondering if there is anything we could have done to help her avoid it.
So yes, that feeling just comes with being a parent, I guess.
I organized a toddler group. Trust me, that absolutely can be real. One mother in particular always seemed to opt for exactly the bad option, from sitting the baby up way before it was ready (-> long-term increase in the likelihood of back problems), to exposing it to sun without sunscreen by choice "for tolerance" (-> long-term large increase in the likelihood of skin cancer), to force-feeding solid food way before the baby could cope (-> nothing long-term, I'm just surprised it survived that). Bad instincts plus outdated or wrong knowledge. Thinking there is some regular peanut diet to follow would have fit right in, as would have completely avoiding peanuts.
I'm very curious what method they used to attribute this to the advice. I ask because I swear I saw something around that time about "trans fats" and how they were a possible culprit in a ton of nutrition-related woes. Notably, 2015 was when trans fat was removed from the "safe" list, and it was on its way out during this time.
It sucks, as I can't find whatever paper I thought I read that implicated trans fats in allergies. Searching "trans fats allergies" shows several; I'm assuming it was one of the main results.
So my question is largely: why would it be more likely that the advice is what reduced allergies? It seems that if there was evidence trans fats were leading to increased allergies, removing them would be by far the bigger driver.
Israel introduces peanuts early in life, and they have very low rates of peanut allergy.
There are a few studies on it, e.g. https://pubmed.ncbi.nlm.nih.gov/19000582/
I have a vague but possibly false recollection that someone noticed that mothers who ate peanuts while breastfeeding had lower incidence of peanut allergies in the child. And they started pulling on that thread.
Glad to hear grandma's approach of "just give them a bit of everything" has now been proven correct :)
The problem is there are always exceptions, like honey for infants.
Or alcohol. Or boiled poppy to make them calmer.
Everyone has been fed small amounts of alcohol as a child though - it's in everything from fruit to bread.
That was/is a thing? Opioids? The actual amount in there is another question, but still.
https://blog.sciencemuseum.org.uk/the-addictive-history-of-m...
Yes, _huge_ thing. Most ‘patent medicines’ were either booze or laudanum.
My grandma talked about it. With an "I haven't done that" comment, but yeah, apparently it was something she was advised to do in all seriousness.
There's a certain wealthy area near me where restaurants ask first if you have allergies, and ice cream shops ask if dairy is ok. My wife and I always joke, "we're in that part of town."
> where restaurants ask first if you have allergies
Pretty common expectation in many countries. I was surprised to see this is not normally a thing in the US, given how we're led to believe how much you guys love to sue each other.
I have never been asked such a thing, in the US or elsewhere. It would be on the customer to inform the staff of any allergies.
> Pretty common expectation in many countries.
I've been to 40+ countries and not once have I been asked about allergies at a restaurant or food shop (i.e., ice cream, etc.)
I'd say I am asked this question at about half of the restaurants I eat at in the US (in the northeast.)
Not sure, but there's a funny saga about sesame in bread because of lawsuits here.
Is the joke that they are respectful with regards to allergies? Or am I reading a bit much of an attitude into your comment? Because it comes off as rude and tone deaf.
With a child who has an anaphylaxis-level PA and has had such a reaction a couple of times, who has thus built up fear and anxiety, and whom we can't just casually let attend birthday parties etc. etc., I can assure you it's not a joke to us.
And no, we are not overly clean; in fact we love going outdoors into the woods and getting dirt under our fingernails. Nor did we keep her off peanuts when small; her first reaction came when she had just learned to walk, at about 10 months, and ate a tiny piece found on the floor. And we as parents work very hard on trying to have a casual attitude towards life, on working through her anxiety, and on not letting the PA define who she is or what she does. But then something like last week happens: those who make the food for school messed up her box of food, she ate mashed pea patties and got really, really bad, the worst in years. Boom, all her confidence in school down the drain.
It's heartbreaking, really. To find her have all that fear and pain, and we can only do so much to help her with that. And it's heartbreaking to see it being a joke to some. When I see such attitudes, I try to think that it comes from someone who is living a happy-path life, and well, good for you.
Thanks for coming to my TED talk, and smash that bell button.
The joke is that there is a group of people who are fashionably allergic to various things. Remember 15 years ago when suddenly everyone was celiac? I'm all for cutting junk carbs out of the diet, but this was something people were just "discovering" for themselves. It always seems to be some sort of health-nut crowd which is often far more vocal than those actually suffering.
I never heard people self-diagnosing as coeliac, that would be ridiculous - diagnosis for that requires both a positive blood test for the antibodies and then a gastroscopy/biopsy. The 'trendy' crowd tended to be self-professed as "gluten intolerant" or "gluten sensitive".
Given the rarity of celiac disease, the amount of shelf space dedicated to gluten free products suggest a lot of self diagnosis.
Well, I'm grateful for that. As is my wife, who is Celiac.
And some gluten free things are pretty good (I'll generally take a gf brownie, cornbread, or carrot cake over the alternative).
But since when is accessibility a bad thing? Are people really troubled by there being options?
Same with that everybody is suddenly neurodivergent now.
Shouldn't you as a customer be in control of which food is served to you? If you have an allergy, ask for the component list and then decide what to order.
As if in rich neighborhoods people cannot talk and need to be babysat by the serving staff.
You're making this out to be a problem when there is none... but I get it, hating on 'rich neighborhoods' is an easy target.
Basically, what is wrong with asking if someone has allergies? If you don't like it, don't go.
That I can, and that I do :)
However, I object to the notion that people being considerate of people with allergies, or people with allergies, is weird and ok to be made fun of.
Is it really healthy for people to fret over minor allergies they probably don't even have, and pass that anxiety onto their kids? And can't this behaviour even cause allergies, and cause people with real allergies to be taken less seriously?
You know who "those people" are, don't pretend you don't. You just don't want them mocked either because they make you feel comfy or because you get mistaken for them.
18 years of working in restaurants checking in: we did everything we could to be respectful of allergies, but the onus was always on the customer to tell us. Now when I go places to eat they ask me, which is a different approach and, like the comment OP, one I find much more often in more expensive restaurants. I think you're reading way too deep into this comment. What do you think, chat?
I just think it seems hugely inefficient for a waiter to ask hordes of non-allergic people for allergies for every one allergic person that could just as well announce theirs.
So what? It's a trivial kindness and it takes a few seconds. What else were they going to do with those seconds? It's different if it's fast food, where maximizing people I/O does matter. But that's not the case here.
Further if the restaurant asks ahead of time, that's a signifier that they take it seriously. If you have to tell them, it's much more likely you will encounter cavalier treatment of cross-contamination and such. For some people, that really can be life or death.
It's a signifier that you're in that part of town. Life or death for some people.
The joke is on parents who have made it a class signifier that they can afford to be more involved in their child's life. This then extends to the businesses they frequent.
What is heartbreaking for me is all the wasted effort and pressure parents are putting on their children for little tangible gain.
Who are you making the joke to?
These jokes are always of the form "we are in a superior group who know things those outside the group don't".
In this case: "allergies and intolerances are made up stuff for weaklings, haha".
I don't think there's a joke here, or at least I'm not reading it this way. If anything it's not about "weaklings" but about being in an area where people are more likely to sue a business if they aren't warned beforehand - a street vendor elsewhere will not ask because their risk profile doesn't include being sued by someone who bought ice cream from an ice cream truck.
Like if you walk into a store and they offer you coffee or even a glass of prosecco, I would also say to my wife "oh we're in that kind of store now" because you know you're about to be ripped off in some way. Not that other stores are for weaklings.
In this case, the joke is that someone with a dairy allergy could somehow buy and eat ice-cream without knowing it contained dairy.
It's just a completely ridiculous thing to check. Like warning that the boiled peanuts contain peanuts, or that a pencil sharpener should not have fingers inserted into it.
Possibly something more like "people who have non-imaginary dairy allergies aren't likely to go into an ice cream shop, and even if they do, they obviously won't order without emphasizing their dairy allergy".
It's kind of similar to the Whole Wheat Bakery asking you whether you're OK with gluten. If you aren't, you made a big mistake walking in.
I think it is ignorance.
Affluent areas in general have more variety. The ice cream shop may be a place where you can get all kinds of ingredients that you wouldn't find at other places. This is 100% true for "fine dining" and it's one reason why they ask.
They will also have substitutes for an allergy to make the experience just as pleasant, thus they ask.
Yeah like I've never been to a Starbucks
> I think it is ignorance.
> Affluent areas in general have more variety.
No, it's definitely a difference in cultural norms, not something driven by the store inventory.
> They will also have substitutes for an allergy to make the experience just as pleasant
This is not the case for dairy in an ice cream shop, or for wheat in a "whole wheat bakery".
Many ice cream shops literally have dairy-free ice cream. Ben & Jerry's famously has dairy-free ice cream.
Do they ask every single customer if dairy is okay? Because that was the original point... not that the ice cream shop also offered non-dairy versions.
They don't. Well maybe in that part of town they do, haven't tried.
Yeah I don't even know why this is a thing, could be lawsuits for all I know, but it's not about the menu.
The considerate part of town?
> ask if dairy is ok
> "we're in that part of town."
Given that lactose intolerance impacts non-Caucasians much more, this reeks of racism.
Lactose intolerance prevalence by race:
African Americans: 75-95%
Asian Americans: 70-90%
Native Americans: 70-80%
Hispanic Americans: 50-65%
Caucasians: 15-25%
Lactose intolerance isn't remotely the same thing as an allergy.
It's even funnier cause that area is white. But even if it weren't, idc
It is very much a race and culture thing across the Midwest. Dairy and cows have a connection to being "white American" all the way back to the Old West and cowboys. A huge number of them are lactose intolerant (Native American genes) but insist, and even put family pressure on anyone who tries to move away from cheese, butter, milk, and dairy in general.
They should say the previous recommendation caused millions and millions of kids to have peanut allergies that could have been avoided.
In my experience there's still much more to this. I'm sure it helps at the population level like the article describes, but it's not foolproof. For our first we were feeding nuts early, and they still developed an allergy to all nuts. Our second didn't get nuts until much later and he's fine. There's more to the story than timing; notably, my first has eczema and asthma too, so there's that atopic march.
Did you medicate for reflux?
Allergy rate decreases with birth order. Of course, that's at the population level and probably not strong enough effect to notice if you only poll a dozen parents you know.
As the 3rd child that had more allergies than I can really make sense of, I'm curious on this. Any recommended studies to read up on this?
Something about this just reminds me of when I did a literature review in my anatomy class to address the question: "Is running bad for your knees?"
I had to decide which of two sets of peer-reviewed publications that contradicted each other was least guilty of using the data to support the conclusion, rather than letting the data speak for itself and making an honest conclusion.
Compared to PhDs, MDs hate designing an experiment and would rather just extrapolate a different conclusion from the same longitudinal study by cherry-picking a different set of variables. The only articles I bother reading from the NEJM anymore are case studies because they're the only publications that consist of mostly-original information.
Running experiments is really really hard due to regulations. Difficult to blame doctors for that.
https://www.astralcodexten.com/p/book-review-from-oversight-...
Thanks for this link, very interesting.
Notable citation:
> A system that began as a noble defense of the vulnerable is now an ignoble defense of the powerful.
The fun part is realizing that any and all exercise comes with risks, and running probably is bad for your knees in the long term - but maybe the long term health benefits to the rest of your body of running outweigh the risk of damage to your knees.
Your personal health profile or family history may also put you at higher risk for cartilage degeneration from running, which would shift the balance in the other direction.
Blanket statements about medical outcomes like that are useful for medical practice in general, but can be misleading for individuals making health decisions if they ignore other relevant factors. There's also plenty of doctors who will not take those other relevant factors into account and just go by whatever the last training or research they were exposed to (which, incidentally, is also why big pharma companies invest in salespeople to target doctor offices - because it works).
Why "probably bad" though? If you plug that exact query into Google, numerous recent studies will tell you that it is probably good for long-term knee health too (stress builds resilience unless recovery time is prevented). Which studies are probably right?
"Probably" is being used here because the body doesn't have a really good way to rebuild cartilage, especially as you age, due to lack of blood flow into places like the patella. Knee and hip replacements are on the rise (https://oip.com/the-lowdown-on-the-uprise-of-knee-and-hip-re...) as well in Boomers, indicating that age related degeneration (with or wihtout a history of running) is fairly universal and expected.
There's absolutely some perfect middle ground of "just enough" running that will strengthen, but not deteriorate too quickly, your knees - but again where that point is will vary by individual. It also may not be something that can be determined except in hindsight, partly because medical professionals generally don't start monitoring cartilage until the person is reporting pain or mobility issues (or a known condition they're checking for symptoms of).
Point being that statistically there are useful trends in aggregate data that can be observed, but, paradoxically, those trends don't necessarily translate to good general medical guidance. One counterexample where those trends do translate would be something like that peanut allergy study from 2015 that was linked on HN recently about introducing allergens earlier and frequently to babies, resulting in fewer teen/adult allergies.
> because the body doesn't have a really good way to rebuild cartilage
Okay, but I’m still noting that if you google this exact claim, numerous recent studies found that running is found to build cartilage, contrary to past assumptions
Right but is simply walking better for you overall? You still get exercise without the forceful impact on knees.
You can get almost all of the health benefits of running from walking (weight loss, cardiovascular performance, etc.), it just takes much longer. Also, running is better for the bones (but worse for soft tissues).
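To put rough numbers on "it just takes much longer", here's a MET-based sketch (the MET values are textbook approximations from the Compendium of Physical Activities, and the 75 kg body mass is an arbitrary assumption):

```python
# kcal burned ~= METs * body mass (kg) * duration (hours)
WEIGHT_KG = 75
MET_RUN = 9.8    # running ~6 mph
MET_WALK = 3.5   # brisk walking

def kcal(mets: float, hours: float) -> float:
    return mets * WEIGHT_KG * hours

run_kcal = kcal(MET_RUN, 0.5)                     # a 30-minute run
walk_minutes = run_kcal / kcal(MET_WALK, 1) * 60  # walking time to match it
print(f"30 min run ~{run_kcal:.0f} kcal ~= {walk_minutes:.0f} min of brisk walking")
# -> roughly 84 minutes of walking to match a 30-minute run
```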
Cycling/spin can yield more aerobic intensity with less stress on the knees; you can't even get to Zone 2 with walking unless you're very overweight (see the sketch below for what Zone 2 means in numbers). Of course there are bodyweight options like aerobics, shadow boxing, and jumping rope.
Skipping rope would be my favorite were it not for the fact that you need a lot of overhead clearance for the rope. That makes it unviable except outdoors or at a gym/facility.
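For anyone unsure about the Zone 2 claim above: one common rough convention is 60-70% of maximum heart rate, with max HR estimated as 220 minus age (both are crude rules of thumb, not physiology-grade formulas):

```python
def zone2_band(age: int) -> tuple[int, int]:
    """Approximate Zone 2 heart-rate band via the common 220-minus-age estimate.

    Both the 220-age formula and the 60-70% band are rough conventions;
    individual maximum heart rate varies widely.
    """
    max_hr = 220 - age
    return round(max_hr * 0.60), round(max_hr * 0.70)

lo, hi = zone2_band(40)
print(f"Approximate Zone 2 for a 40-year-old: {lo}-{hi} bpm")  # ~108-126 bpm
```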
Not what this conversation is about, but anyone running and worried about their knees should consider doing a little cycling. Doesn't have to be fast or high resistance, but it does supposedly "massage" your joints without impact and help cartilage recovery. I definitely noticed a difference with myself and about two dozen clients with knee issues from running intensely (military, athletes, etc.).
As somebody who uses both, I personally think a rowing machine is better for general cardiovascular exercise than a bicycle. The work is better spread over your whole body instead of mostly the legs. You can get cheap ones with magnetic resistance that work fine for exercise purposes (the main advantage of more expensive rowing machines is more accurate simulation of rowing in a real boat).
Cycling is however a lot more interesting if you have somewhere good to ride.
Or row on water. Before dawn, songbirds and pelicans. Rowing machines are the fallback, not the first choice, just like running machines.
If I had to load a boat onto my roof rack and drive to the nearest river at 5am every time I wanted to exercise, I'd do it once a week at the absolute maximum. I don't think it's a reasonable way to exercise for most people.
This is true. It's also why I moved to live near water. But a lot of people in the rowing club do exactly this: row at most 1-2 times a week.
Rowing machines are fine. I'm not sure why they have a he-man scale going up to 11 when the on-water experience is mostly below 4, but I guess people need goals. Bad back goals.
Better yet swimming... but both biking and swimming require a thing, where running does not.
Skipping rope is also a great option. Cardio is up there with running and it's not as hard on the knees. We usually start every session at my muay thai gym with it, and whenever I travel I'll just throw a cheap rope in my bag.
I do love just getting out and running though!
The question seems really poorly formed! Like there’s never going to be a binary answer to a question like that. The answer is always going to be “it depends” on for example the volume, your physical attributes, recovery, genetics, age etc
I'll give you the principled answer and the cop-out answer. Here's the principled answer:
That was just the catchy title, similar to peer-reviewed literature reviews published on nih.gov: not necessarily the crème de la crème, just good enough to pass peer review. The real question is whether concern for cartilage erosion is well-founded, and whether or not it outweighs the scientific consensus that running improves bone density of the tibia and fibula. Again, the literature had strong evidence for the latter while the former was still a major controversy in kinesiology.
I didn't even touch cardiovascular health, because to be fair we live in a world with bicycles and affordable YMCA lap pools.
Here's the cop-out answer: it's a literature review; the very requirements are merely one step removed from those blog articles Harvard Medicine publishes for mass consumption. I followed instructions, one of which was to adhere to a maximum of 2 1/2 pages, and I got a Northwestern 95 on the assignment.
For people who like nuance and details, yes. But the point is, most people don't want that; they want to be told what to do, or to make a binary decision: good or bad.
FWIW I tell people that running is bad for your knees, but relative to other exercises! If someone wants to only run, then go do it... better than nothing.
My take is: running is sometimes bad for your knees, but being sedentary is pretty much always bad for just about everything else.
Were you allowed to not reject any of the null hypotheses and thus come to no conclusion?
For example, say 3 papers are ridiculous; could you say "they are all ridiculous, there is nothing learned, we know nothing new from them"?
There was a period of a few decades (I guess still ongoing, really) where parents sheltered their kids from everything. Playing in the dirt, peanuts, other allergens. It seems like all it's done is make people more vulnerable as adults. People assume babies are super fragile and delicate, and in many ways they are, but they also bounce back quickly.
Maybe part of it is a consequence of the risks of honey, which can actually spawn camp infants with botulism. But it seems that fear spread to everything.
Not to confuse things: there quite simply is a long list of things that can kill an infant, and we are getting increasingly better evidence for what's on that list and what is not. Avoiding death at all costs is ludicrous, but for a child born in the 1950s in high-income countries the mortality rate was ~5%: 1 in 20 kids dead before the age of 5. For contrast, it's now closer to 1 in 300 (quick arithmetic below). That's not a coincidence but a lot of compounding things we understand better today.
Are there missteps? Certainly. Figuring out what is effective, what has bad secondary effects (fragility, allergies etc) and what is simply wrong is an ongoing effort and that's great, but less dying is a pretty nice baseline and progress on that front is inarguable.
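The quoted improvement, as one line of arithmetic:

```python
# Under-5 mortality, using the figures quoted above.
rate_1950s = 1 / 20    # ~5% of children dead before age 5
rate_today = 1 / 300
print(f"~{rate_1950s / rate_today:.0f}x fewer child deaths")  # ~15x
```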
To be a bit morbid, one could also explain OP's observation that "people are more fragile" via the lower child mortality, with the hypothesis that these more fragile people wouldn't have made it through infancy before.
I don't particularly believe this, but it fits Occam's razor, so it seems to deserve some examination.
Occam's razor is basically (paraphrased) "given two explanations where all else is equal, the one with the fewest added assumptions is most likely true." On that basis Occam's razor is already out the window, because all else isn't equal.
Also this "more fragile people" argument assumes the "fragility" is both inherent and of a lifelong kind. This ignores that most causes of infant mortility are external, and that for many of those being exposed to them results in a lifelong increased mortality risk. Excessive hygiene leading to more allergies is a direct example of this.
> but it fits Occam's razor
How? You can use that to decide between two (or more) explanations, but you only presented one.
It was implicit, at least to my eye, that the other explanation being offered as a counterpoint was the grandfather comment.
For clarity, I will include both here:
The two explanations for increased adult fragility are:
forgotoldacc> Parents shelter their children too much and have created adults that have additional allergies as a result of lack of childhood exposure
rocqua> Increased sheltering of children has allowed more of the fragile ones to survive to adulthood, increasing the number of fragile adults we observe today.
What’s this increase in fragile adults you’re talking about? Are you sure it’s a real thing? Are you aware how staggeringly high rates of institutionalisation were in most western countries in the early to mid 20th century? And then there were the adults who were considered ‘sickly’. Like, _fainting_ wasn’t considered dramatically abnormal behaviour until quite recently.
A lot of people who today would be considered to have a condition which is entirely treatable by doing (a), taking (b), not doing/avoiding (c), etc, would, a century ago, have just been kind of deemed broken. Coeliac disease is a particularly obvious example; it was known that there was _something_ wrong with coeliacs, but they were generally just filed under the 'sickly' label, lived badly and died young.
(And it generally just gets worse the further you go back; in many parts of the world vitamin deficiency diseases were just _normal_ til the 20th century, say).
That makes a huge number of assumptions, but it also wouldn't fit their experience. If it were this, it would add a few percent of the population being "more fragile", and I'd wager they see it as a broader trend.
Intuitively, this does make a lot of sense, and it's easy to make an argument that if civilizational progress continues, in the far future people will in general have very weak bodies, simply because reliance on medical equipment won't be an evolutionary disadvantage.
I think most of the change in death rate is improved medicine (and maybe wealth too – plenty of people in the US in the 50s were very poor by modern standards) rather than parents knowing about many potential harms. (Maybe I'm wrong? Happy to be corrected here)
This is the conclusion I lean towards, but anecdotally one of my grandparents knew something like 3 or 4 kids who died before the age of 15, all in preventable accidents. Disease got at least a few more. It’s possibly just a coincidence but hearing the stories of how inattentive people could be to their children back then, I’ve always suspected current helicopter parenting norms must have accounted for at least some of the decline.
There’s been a similar shift with people letting their dogs roam free. When I was a kid I remember hearing stories about a dog getting run over by a car every year. I rarely hear these stories anymore because people usually keep their dogs supervised or in a fenced yard. I don’t have any hard data, but I suspect there’s something to these cultural shifts.
Vaccinations and better antibiotics reduced death rates a lot, but in 1950 accidents were still 30% of the death rate for children, killing 5 times as many children as die today from all causes.
The death rate for children aged 5-14 is 14.7 per 100,000, i.e. 0.0147%. That's basically zero, and five times that much is still basically zero (see the conversion below). By comparison, the death rate for the 35-44 age group was 237.3 per 100,000.
Also, the most common type of accidental death is car accidents. So is even that difference from kids not getting to play outside anymore, or is it radial tires and crumple zones?
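Converting the parent's per-100,000 figures into percentages makes the "basically zero" point concrete:

```python
# Annual death rates quoted above, converted from per-100,000 to percentages.
child_rate = 14.7 / 100_000        # ages 5-14, today
accidents_1950 = 5 * child_rate    # 1950 accident deaths, per the parent comment
adult_rate = 237.3 / 100_000       # ages 35-44, for comparison

print(f"children 5-14 today:   {child_rate:.4%}")      # 0.0147%
print(f"1950 child accidents:  {accidents_1950:.4%}")  # ~0.0735%
print(f"adults 35-44:          {adult_rate:.4%}")      # 0.2373%
```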
Do you have a source for that?
> for a child born in the 1950s in high income countries the mortality rate was ~5%. 1 in 20 kids dead before the age of 5.
Essentially all of this was infant mortality, i.e. kids who died before the age of 1, and that in turn was more related to things like sanitation and vaccines and pre-natal screening.
Large-scale antibiotic production didn't start until the 40s in the US, and it maybe took a while to spread to all the other wealthy countries. Was that the main factor?
Quick look into it, in the 50s:
- Before the age of 1, top cause of death were defects (prematurity/immaturity, birth injuries) and congenital deformations.
- Age 1-4 it was accidents (e.g., drownings, burns, traffic) followed by influenza/pneumonia.
But if you got into an accident, wouldn't antibiotics help with the injury, surgery, etc.? A bad burn could get infected, and so on. And possibly similarly for some birth injuries and birth defects, and for bacterial pneumonia for sure.
I wish society at large could be on par with this nuanced and rational opinion. I miss when science was celebrated.
Rationality and science might be pretty far apart. Flying a kite with a key on it in a thunderstorm, for example, isn't the most rational decision. Neither is scraping open your family's arms and applying cowpox pus.
Pretty irrational, but definitely celebrated... eventually.
Risky and irrational are different in my mind.
If the best available means to perform an experiment carries some risk, it could still be entirely rational to do it rather than forfeit the knowledge gained from the experiment.
Rational/risky experiments are illegal currently.
For example, take the famous mask debate. It could easily be settled by having volunteers stand in a room with people with covid at various distances, each using randomized masks/no mask. There would be plenty of volunteers for such a study, but there's no way it would be approved.
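For scale, a standard two-proportion sample-size calculation suggests such a trial wouldn't even need to be large (a sketch; the 30% vs. 15% infection rates are invented placeholders, not data):

```python
from math import ceil
from statistics import NormalDist

def n_per_arm(p1: float, p2: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Normal-approximation sample size per arm for comparing two proportions."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    n = (z_alpha + z_beta) ** 2 * 2 * p_bar * (1 - p_bar) / (p1 - p2) ** 2
    return ceil(n)

# Placeholder assumption: 30% infection rate without a mask vs. 15% with one.
print(n_per_arm(0.30, 0.15))  # ~122 volunteers per arm
```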
The FDA doesn't count lives lost due to inaction and slow approval of new drugs and treatments. As Munger always said "show me the incentive and I'll show you the outcome." By any rational calculus, that one Thalidomide win by the FDA has caused incalculable death, pain and suffering by pushing out the timeline on not only recently discovered cures but all those built on top.
Imagine for example the number of lives saved if GLP-1 was purchasable over the counter in the 1990s when it was first discovered.
There seem to be some quite powerful forces acting in the opposite direction: social media maximising engagement by pushing divisive stuff, and politicians trying to demonize the other team. Not quite sure what the answer is. I feel there should be some tech-type solution. At least LLMs, by taking in the whole internet, currently seem fairly neutral, although Musk seems to be trying to develop right-wing versions.
> I miss when science was celebrated.
One could argue that science being celebrated too much leads to this type of present-day outcome. Science can tell you how to do something, but not why, or even what we should do to begin with.
It’s not just “save as many lives as possible at all costs”: saving 20 kids when 2 of them will develop debilitating peanut allergies isn’t worth it. Progress must be made slowly, ensuring no harm is done along the way.
Science failed here.
What on earth are you saying? It's better to kill 20 children than to risk that 2 of them develop peanut allergies? I don't see how this can even begin to be an arguable position to take. And that's ignoring the fact that it isn't even a correct assertion in this case.
They’re not mutually exclusive options; we can save the 20 kids safely while having a mindset that values doing no harm.
Telling anxious parents to have their kids avoid peanuts caused harm that wouldn’t have happened otherwise. I guess it’s valuable to better understand allergies, but learning at others’ expense isn’t worth it.
> It’s not just save as many lives as possible at all costs, saving 20 kids but 2 will develop debilitating peanut allergies isn’t worth it.
Your math isn’t checking out here.
I clearly misspoke and people are misunderstanding my point, which is only that “hurting people is worth it” is a horrible argument and shouldn’t be treated as a virtue. We can and should save the 20 kids without causing harm to the 2.
Doing nothing is better than doing something when that something might hurt people in ways we don't understand.
People are misunderstanding your point because you are doing a terrible job of explaining it.
What specifically do you disagree with? I’ve explained it three different times now and can’t delete my original comment so please let me know
This research shows physicians harmed kids by recommending they avoid allergens like peanuts. Is that something we should ignore because all the benefits of science are “worth it”?
Science is amazing not because it’s always right, but because it strives (or should strive) to always do better next time.
All you're fucking doing is saying "Don't save a million people if 1 person is going to be harmed" OR making the utterly trite point of "wouldn't it be great if everything was magical and no one was harmed by anything ever".
What you’re describing is called utilitarian ethics; the exact tradeoff is called the trolley problem. Ethics is much more complicated than a single comment thread.
“it’s worth it” is a horrible argument when people’s health is on the line.
Yeah, we should just round up all those peasants with peanut allergies and shoot them!
So you avoid things like electricity and the internet, because they've caused children's deaths too?
I’d prefer to live in a world where the same technology had developed in such a way that they didn’t have to die, yes.
"What doesn't kill you makes you stronger" makes for a fun little statement. It's not actual natural law though, right? I feel like it's fairly well documented that good hygiene is a win for humanity as a whole, so I have some skepticism for generally saying "well let the kids eat dirt". We did that for centuries already!
The thing I'm a bit curious about is how the research on peanut allergies leading to the sort of uhhh... cynic's common sense take ("expose em early and they'll be fine") is something that we only got to in 2015. Various allergies are a decently big thing in many parts of the world, and it feels almost anticlimactic that the dumb guy take here just applied, and we didn't get to that.
Maybe someone has some more details about any basis for the original guidelines
Speaking as someone who has had a lot of experience talking with doctors in poorly-understood clinical situations over the years, most doctors display a need to establish informational authority over their patients.
So if the "dumb guy" take is "just expose the kids, they'll adapt to it", in the absence of hard evidence to the contrary (and maybe even with it) the average doctor is going to _reflexively_ take the opposite position, because that shows that you (or the conventional wisdom) were wrong.
There are exceptions, and they are either the ones that just don't care at all, or they're the best docs you'll ever find.
A justification I read once is that the human immune system evolved to deal with a certain amount of pathogens. If you don’t have enough exposure to pathogens, the immune system still tries to do its job, but winds up attacking non-pathogens.
> Various allergies are a decently big thing in many parts of the world
Maybe we live in bubbles.
I am from Asia. The only time I have seen people taken to the emergency room for an allergy is in American TV shows. Here I've never seen it in my whole life, and I didn't even know an allergy could be this dangerous. We don't have peanut allergy either. The first time I saw it on TV, I was very confused.
Allergies do exist here, but not to the extent I've seen in American TV shows or heard about online.
The only thing I remember is people needing to take medicine for allergic reactions to venomous caterpillar hairs they touched by mistake, or to stings from honeybees, wasps, etc.
I think this is selection bias. I know plenty of people in Asia who have plenty of allergies to some degree or another (selection bias on my side as well).
Hell, most of hayfever season in Tokyo is a bunch of people with allergies!
I think you should remember that American TV shows will use certain kinds of extreme scenarios to make a story. Lots of people who are allergic to things in a fairly benign way.
And also just more generally, I think Americans will be more likely to identify that _they have a shrimp allergy_ when every time they eat shrimp they feel bad. But I know plenty of adults who just go through life and be like "I guess I feel sick every time I eat this" and not be willing to use the word "allergy".
It makes for good TV. I think only a couple hundred Americans die a year from anaphylaxis. And many of those are from medication allergies.
Or maybe the prevalence of peanut allergies is really low.
A quick Google search says Asian populations have more allergies to buckwheat, royal jelly, and edible bird nests from swiftlets. Shellfish is still one of the most common allergies anywhere.
Those are not normal foods where I am. I've never eaten any of those. Basically, I've never heard of people getting allergies from foods that we eat here.
Same in a decent chunk of Europe too. Allergies exist, but they're rare and more of the type where you're not quite sure you believe the person telling you they're allergic, because it hadn't even occurred to you that there could be an allergy to that. Like tomatoes, peppers, raw carrots.
The UK seems to be a bit of an exception. And it shows: the only two countries where waiters have asked me about allergies as standard practice are the US and the UK.
If it makes you feel better I’m nearly 50 and I have never in my life heard of people needing to take allergy medication for mistakenly touching caterpillar hairs.
That one incident was serious: the person slept on a caterpillar and got stung all over their body. Here all caterpillars have venomous hairs. Personally, I've touched them many times by mistake but didn't have to take medicine; the itchiness and swelling goes away within an hour.
>only got to in 2015
I think a lot of the delay is that it took a while for people to realise there was a problem. The perhaps-excessive hygiene thing didn't really get going till the 1960s, so you didn't really see the rise in allergies till a couple of decades after. Then maybe scientists started figuring it out in the 90s, and it takes a while for a finding to become proven enough to recommend to parents?
What doesn't kill you postpones the inevitable. Sometimes it makes you stronger, often it makes you weaker. E.g. if your arms get amputated you're extremely unlikely to break your bench press personal best afterwards.
> "What doesn't kill you makes you stronger" makes for a fun little statement. It's not actual natural law though, right?
I'm pretty sure it is.
https://en.wikipedia.org/wiki/Immunological_memory
https://en.wikipedia.org/wiki/Supercompensation
Sorry I might have expressed myself badly, I get it works sometimes but it's not a hard law for "everything", even if... maybe it's a good default? maybe?
Not true generally. For example, catching measles can wipe out your immune system, thereby making you more likely to get sick. Other pathogens can also work this way.
No, it is not in any way a universal principle. A counterexample is lead: a little lead in the diet does not make you stronger.
More generally regarding poisons, see https://en.wikipedia.org/wiki/Mithridatism . TLDR: YMMV.
"Mithridatism is not effective against all types of poison. Immunity is generally only possible with biologically complex types which the immune system can respond to. Depending on the toxin, the practice can lead to the lethal accumulation of a poison in the body. Results depend on how each poison is processed by the body."
"A minor exception is cyanide, which can be metabolized by the liver. The enzyme rhodanese converts the cyanide into the much less toxic thiocyanate.[12] This process allows humans to ingest small amounts of cyanide in food like apple seeds and survive small amounts of cyanide gas from fires and cigarettes. However, one cannot effectively condition the liver against cyanide, unlike alcohol. Relatively larger amounts of cyanide are still highly lethal because, while the body can produce more rhodanese, the process also requires large amounts of sulfur-containing substrates."
Our immune, metabolic, and other systems are built to be adaptable, and some things are easy to adapt to, but other things are difficult or impossible for them to adapt to.
While that deals with deliberate poisoning, when it comes to environmental contaminants (lead and other heavy metals, PM10s from vehicle exhausts, the other by-products of coal power stations and wood fires, etc.), I suspect that long-term exposure is not something where "you can build a tolerance" is a useful framing at all. Even if you technically do, it's irrelevant to the harm caused over time to whole populations.
Only a sith deals in absolutes.
Nobody is suggesting you go and add some heavy metals to your corn flakes (except you).
Well they are, if they're suggesting that "what doesn't kill you makes you stronger" is anything beyond a catchy saying.
> (except you)
The post that I am responding to does in fact deal in absolutes by asserting that "What doesn't kill you makes you stronger" is a natural law. Please don't troll by attributing that to me.
My more detailed take on this is here: https://news.ycombinator.com/item?id=45653240
It is in response to someone else who is dealing in absolutes. It seems pretty common, actually. Must be a lot of Sith around today.
There are tons of counterexamples. Chronic traumatic encephalopathy. Limb amputation. Displaced bone fractures that are never set. Crohn's Disease. Being in a coma for six months and losing all of your muscle mass. Third degree burns over 90% of your body. Plenty of things that don't kill you also don't make you stronger.
also:
https://en.wikipedia.org/wiki/Hormesis
The funny thing about trying to apply this logic in reality is that it often breaks down in ways that can be really, really bad.
I've brought up this example many times before, but Measles is a great example. Measles resets your immune system and breaks immunological memory for anywhere up to three years after having recovered from it. But now we have a bunch of people that assume any diseases can simply be dealt with in a natural way by your immune system thanks to the logic above, and well, the consequences of that are becoming clear.
> "What doesn't kill you makes you stronger" makes for a fun little statement.
It comes from a philosopher (Nietzsche), talking about something completely unrelated to health care. Ironically, it was a strong criticism claiming that people who say things like that are stupid, made by one of the people most vilified in history through being misunderstood when claiming that things are stupid.
> "well let the kids eat dirt"
I always think about how animals eat: their food is basically never clean and is always mixed with dirt. Evolution has dealt with this problem since forever.
Plenty of stuff is poison to animals as well as humans! Lots of animals get sick and pick up parasites from everything they eat.
Like with humans, though, animals have immune systems which help. This is the trouble with food hygiene arguments: you can eat "dangerous" food and 99% of the time be fine. But it's still good for people not to roll the dice on this stuff, even with a 1% hit rate. We eat food 3 times a day, so at that rate you'd expect roughly 11 very adverse events per year!
"Yeah I get food poisoning once every month or two" is a thing that some people live through, and I do not believe they enjoy it. I have not have food poisoning for a very long time, and appreciate it a lot.
And one of the ways evolution dealt with this problem is by evolving intelligence that can then tell you to improve hygiene practices to reduce the "natural" death rate.
>Evolution dealt with this problem since forever.
For humans, that solution may have been 9-month gestation periods and two-decade fertility windows. A solution, to be sure, but not very desirable.
My dog will eat literal street crap at the first opportunity. She’ll also just throw it up on the carpet 2 hours later if she’s not feeling it. Not sure that’s really an improvement.
And most of them die young.
But mostly not because of what they have eaten.
Citation needed.
Most animals die. Most babies died, too, until medicine and hygiene came along.
You have to balance the future immune system with current dysentery.
Yes, evolution kills the weak. I don't think you're saying "let them die"?
I have a great example of this. For our first kid, we had created a Sterile Field in our kitchen for pacifiers, baby bottles, etc. The sanctity of the Sterile Field was never violated. We would wash things by hand and then BOIL them and place them into the Sterile Field. This kid is allergic to tree nuts and a few other things.
For baby number 2, soap and water is enough. There's no time for Sterile Field nonsense. This kid isn't allergic to anything.
There was a local mom who had 4 thriving kids. When their baby dropped the pacifier in the dirt, it just got brushed off and handed back to the baby. I don't think those kids had any allergies.
For what it’s worth I was raised like kid 2 and have a bunch of annoying allergies. It’s far too messy to look at individual cases.
Same. I grew up on a farm and was constantly outside and around dogs and horses. I need allergy shots as an adult.
I've not seen a lot of research about how allergies develop as you get older.
For me, as a kid: very, very allergic to cats, kinda allergic to many food items, and a little allergic to horse hair (only noticeable when they were shedding in the spring).
As a young adult: Only 2-3 food allergies remain, cats still strong, hayfever starts.
Then I took some shots against the hayfever for 2-3 years, and the cat thing has mostly improved and the hayfever is basically gone. So only 2-3 food items remain.
As an adult I developed an allergic contact dermatitis reaction to some sulfates (sodium lauryl sulfate definitely, sodium laureth sulfate definitely, and something in raw onion juice) after a bad burn on one of my fingers. Probably due to exposure while it was healing, since a lot of soaps like Dawn contain one or both of the two. Self-testing to find a soap that didn't blister my hand, and then narrowing down which ingredients caused the reaction, was a long and unpleasant process. So it's definitely possible to develop new allergies as an adult, as well as to lose existing ones.
The thing is, the sterile field is actually very important... for the first 3 or so months though. The immune system isn't developed enough yet and many medicines cause more harm at such a young age.
However this doesn't need to continue very long until basic cleanliness and medicine can be used effectively without harm.
I wonder if smearing a bit of probiotics on the pacifier could work even better than dropping it in the dirt?
Probiotics is basically a marketing term, and scientifically meaningless. So: no.
Citation? I see studies testing the impact of various alleged-probiotic bacteria fairly frequently. What am I missing?
without knowing anything, that sounds more dangerous for an infant to me
It seems like all it's done is make people more vulnerable as adults.
In 2000, the American Academy of Pediatrics recommended not allowing your kids peanuts until they were 3 years old. It was just parents following doctor's (bad) advice.
To avoid confusion: whole peanuts can't be given directly because of the choking risk, as infants cannot chew them. The advice is to add peanut as an ingredient, e.g. as peanut butter.
Unfortunately everyone will ignore this comment and continue to respond as if peanuts were advised against because of allergy risk.
They were advised against because of the allergy risk, not because of choking hazard. Are you a parent? No shit you don't give hard nuts to a baby with no teeth.
A timely reminder that although doctors aspire to follow science, and many doctors are scientists, and most doctors advocate evidence-based medicine, the practice of medicine is not a wholly scientific field, and particularly the big associations like the AAP are vulnerable to groupthink just like any big org.
Also, science is persistently incomplete, and actually making decisions (or giving advice) requires making assumptions (often neutral ones, but ones that can turn out to be quite wrong) about what is in the unfilled gaps. The advice to avoid peanuts came about because it was clear that severe peanut allergies existed, it was clear that they affected a small fraction of children, and it was clear that when they affected very young children, those children weren't able to let people know what was going on as well as older children and adults could, to enable timely intervention.
There wasn't much information one way or the other on what avoidance did as far as impacting the development of allergies, and with the evidence available, delaying exposure seemed prudent.
> and many doctors are scientists
Is this true? What percentage of doctors are scientists?
"spawn camp infants with botulism" is not a phrase I expected to read on HN today, but I'm all for it...
I’d argue that the fear you speak of spread because it was profitable. I hit the 90’s in my mid-teens, and boy howdy did it seem like every news outlet, especially the local ones, had their sights set on making us terrified to eat or drink things we previously consumed without much thought. Fear gets viewers, which is how revenue is generated, so there’s an arguable conflict of interest there.
The real problem is some of those claims and reports were true, but we were so inundated with the rhetoric that everything was going to kill us that many of us sort of lapsed into apathy about it. Stepping back, the food industry in the US clearly does not have consumer health at heart and we struggle to find healthy options that avoid heavy processing or synthetic fillers. Those parents who sheltered their babies back then may have been on to something when it came to stuff we consume and we should have been on the path to demand better from our food sources had more of us been more diligent with our grocery choices (myself included, at the time), but instead we ended up with bread that lasts unnaturally long and has an allowable amount of sawdust as an ingredient.
Sheltering kids from lead paint flakes is certainly beneficial
There's a pretty clear nuance in my post where I was addressing things the immune system can handle. Not poison that accumulates in the body.
Also not true for things the immune system can handle: many pathogens damage your immune system rather than helping it, even if you recover.
Lead paint flakes are not an allergen; they're a toxin. Your nick checks out.
I'll wager that more children and adults have been killed by assault rifles and oversized vehicles over the past few decades than have died from a peanut allergy.
That kind of assumes they are sheltering kids, but to be honest peanuts aren't really that common a food, certainly not in foods you would commonly give a four-month-old child.
In America and much of Asia, peanuts are incredibly common. This is like an Indian person saying beef isn't a common food. In your country, sure. The rest of the world? No.
Infants in SE Asia are probably getting near daily exposure to peanuts.
[According to Google] My country has a per capita peanut consumption of 1.4 kg per person per year vs America's 2 kg. So not that different.
I still maintain it's mostly in foods people don't generally give to toddlers. People may give a PB&J to a 5-year-old, but they don't generally feed that to a 6-month-old. Not because they are protecting them from peanuts, but because people generally don't give sandwiches to toddlers.
Peanut butter?
Do people give their 6 month olds peanut butter? I'd worry it would be a choking hazard for a child that young.
Really depends where you are. Here in Germany you probably would have Nutella rather than peanut butter.
Peanut Butter is not a very common food, except in the USA.
A big reason that the effect of avoidance was hypothesized and then studied and nailed down is that (even when avoidance became common in the US), peanut-containing snacks were (presumably still are, it wasn't that long ago) a very common food for very young kids in Israel.
The popularity of the Bamba peanut snack has a huge impact on peanut allergies - plausibly a 10x reduction when comparing similar populations.
https://www.jacionline.org/article/S0091-6749(08)01698-9/ful...
Yes, there are some counterexamples. Bamba (peanut-butter-flavored puffed maize) in Israel is one, worth studying as it is commonly given to very young kids.
But generally speaking, the USA is an outlier in the prevalence of peanut butter specifically, and to a lesser extent peanuts in general.
it's common in Australia
UK too. And roasted peanuts.
The Hygiene Hypothesis has been around for a long time.
It will be interesting to see what happens with allergies for those who were born in the 2020-2023 timeframe.
Exposure to microbes and potential allergens relevant to the hygiene hypothesis doesn't seem likely to have changed very much - it's not like people started keeping their babies in sterile bubbles. While lots of wishful thinkers jumped on the concept in recent years, the hygiene hypothesis doesn't apply to disease-causing pathogens like COVID or the flu. But yes, will be something to pay attention to, considering the massive volume of COVID infections and COVID's negative effects on the immune system.
I grew up in a smoking house. We didn't have any house cleaners. We wore our shoes in the house. I spent my childhood outdoors playing in the dirt. When we were thirsty we drank garden hose water or went inside for some Kool-Aid.
No allergies.
Most people don’t have allergies, so as anecdotal evidence, this is, y’know, beyond weak.
Meanwhile, my buddy who grew up with both parents smoking in their house and their car now has asthma. Funny old world innit
Never drank kool-aid, didn't wear shoes inside. No allergies.
Must've been the garden hose water.
You drank the Kool-Aid, I get it
You might be confusing bouncing back with survivor bias. A lot of them used to not bounce back. They had funerals.
> where parents sheltered their kids from everything. It seems like all it's done is make people more vulnerable as adults.
I don't agree that this is "all" that it has done.
There are many cases where reducing exposure as much as possible is the correct thing to do. Lead is the best-known example.
As the other reply pointed out, the second-order effect, the nuance that comes later is that sometimes this isn't the right thing to do.
But it would be basically incorrect to reduce it to blanket, binary, "all good" vs "all bad" black-or-white conclusions just because there is a smaller course correction when it's found to be not entirely good. Concluding that "all it's done is cause problems" is a knee-jerk reaction.
> the risks of honey, which can actually spawn camp infants with botulism
I hadn't heard of this. Very intriguing that only camp infants would be affected.
Most likely you know already, and if that's the case just ignore this comment please. "Spawn camp" in this context is gaming terminology: it refers to an enemy that camps/waits for a long time and kills you as soon as you are placed on the battlefield, which is your spawn point, hence spawn camping.
Thanks, I had not understood previously, and was parsing the sentence incorrectly. I have no prior knowledge of the dangers of honey.
> There was a period of a few decades (I guess still ongoing, really) where parents sheltered their kids from everything.
The hygiene hypothesis is not impossible, but evidence for and against it is questionable. But anyway, for peanuts it's not the hygiene.
It's a much more complex mechanism that retrains your immune system from using the non-specific rapid-response allergic reaction to the T-cell-mediated response.
The same method can be used to desensitize yourself to poison oak or ivy. You need to add small amounts of them into your food, and eventually you stop having any reaction to urushiol contact with the skin.
> There was a period of a few decades (I guess still ongoing, really) where parents sheltered their kids from everything
Not just parents sheltering kids. Take a look at this (in)famous tweet https://x.com/d_spiegel/status/1271696043739172864 from *June 2020* ...
"[eg] women aged 30–34, around 1 in 70,000 died from Covid over peak 9 weeks of epidemic. Over 80% pre-existing medical conditions, so if healthy less than 1 in 350,000, 1/4 of normal accidental risk"
The biggest reason I took covid19 seriously was because many countries in separate parts of the world took drastic measures, unlike nut allergy which is the poster child for first world problems.
> many countries in separate parts of the world took drastic measures
Putting China to one side, broadly speaking weren't the most stringent and prolonged restrictions mostly in wealthier, highly-developed countries?
Poor countries have lots of people who can’t afford masks and shelter at home without risking starvation.
Developing countries also have significantly younger populations, who are at much lower risk.
"Older adults are at highest risk of getting very sick from COVID-19"[0]
[0] https://www.cdc.gov/covid/risk-factors/index.html
It’s obvious from the response this garnered that a lot of users haven’t gotten over this period of their lives ending.
I don't understand why a quotation - a straightforward summary of factual information about the virus and its low risk to a specific group, written by a professional statistician and University of Cambridge professor - is still considered contentious or triggering to some people, even five years later.
The government responses to all that were not super informed on the whole.
Aside from the skin lotion thing[1] that got popular recently, what is the state of the art in 2025 for allergy prevention? It feels like there is a lot of common ignorance in this space, but the literature is full of better practices.
[1] https://www.nationaljewish.org/clinical-trials/seal-study-st...
A relative has tried acupuncture therapy for their kid, and says it works wonders! I'd never heard of it; you would have sworn it was crank magnetism when you read up on it. But they swear up and down about it for their kid, and I've personally witnessed the kid being introduced to food items that they were previously severely allergic to, with very minor and easy-to-mitigate issues.
This world makes little sense, but I guess I'm here for it!
https://pubmed.ncbi.nlm.nih.gov/24881629/
I don’t recall the exact details but our peds encouraged us to feed our kid Bambas.
It was based on a study done in Israel that found Israeli kids were less likely to develop peanut allergies.
>A decade after a landmark study proved that feeding peanut products to young babies could prevent development of life-threatening allergies, new research finds the change has made a big difference in the real world.
I am sorry, but am I going crazy?
We have been giving infants small amounts of peanut butter, egg, etc. for decades where I live. But we also let them play outside, get dirty, and put stuff in their mouths to train the immune system.
This is common knowledge to me.
Sometimes things are common knowledge, but don't necessarily have longitudinal studies to back them up. I think a significant number of people have thought it to be common knowledge, but now there are large studies backing this up as well.
Here is another study, from as early as 2008, that shows similar results:
Objective: We sought to determine the prevalence of PA among Israeli and UK Jewish children and evaluate the relationship of PA to infant and maternal peanut consumption
https://pubmed.ncbi.nlm.nih.gov/19000582/
By that logic, children who got anaphylaxis during a study should later develop resistance to allergies.
Nutritional science has unfortunately been pretty bad at the science part for a rather long time.
There's a dark pattern hiding in the modern era where we assume hard evidence exists where it doesn't: a projection of CAD engineering onto idle theorycrafting and opinion.
I followed you until that last bit...
For any parents wary of trying to think up a way to implement this yourselves: don't. Someone already neatly packaged it up and removed the thinking from the process. (protip: feed it to your baby in a hospital parking lot)
https://readysetfood.com/collections/oatmeal
I wonder why the old advice was being given if it was so wrong? If nobody understood what to do, shouldn't there have been no advice instead of something harmful?
You seem to be suggesting that doctors should not suggest any health precautions until controlled experiments have found them effective. That is the position taken by the highly-cited paper "Parachute use to prevent death and major trauma related to gravitational challenge: systematic review of randomised controlled trials", which you must read immediately, because in a peculiar way it is a paper about you: https://pmc.ncbi.nlm.nih.gov/articles/PMC300808/
You don't need a controlled experiment if you have a good enough understanding of the mechanism, such as with parachutes. But since they apparently had no idea how peanut allergies worked nor had any adequate studies, they should have just shrugged their shoulders when asked for advice.
Even with parachutes, you could do a study (not an RCT) by looking at historical cases of people falling with and without parachutes. The effect would be so strong that you wouldn't need clever statistical tricks to tease it out.
Lots of things that merely harm children can kill infants, so keeping infants away from things that harm some children probably seemed correct. The mechanism for allergy development wasn't well understood, and it seemed reasonable to avoid exposure in case the allergy was genetic or something and would cause a hard-to-treat allergic reaction in the infant.
If people are developing allergies to food, isn’t a logical first step to not expose babies to the allergens? It seems logical. It turns out to be exactly backwards.
It would seem logical, until you learn what allergies are. They are the body's immune system overreacting to something that would normally be harmless, and acting as if it's an invading pathogen. Once you learn that, then realizing "hey, expose the body to this thing early on, and the body's immune system will treat it as normal" is a logical step.
If this theory (that early exposure teaches the immune system not to overreact) is right, then another logical consequence would be that kids who play outside in their early years would have fewer pollen allergies than kids who mostly play indoors and are exposed to far less pollen than the outdoors-playing kids. I don't know where to look for studies to prove or disprove that thesis; anyone have any pointers?
https://www.science.org/content/article/great-outdoors-good-...
Well, I mean, did you know that skin exposure can sensitize and oral exposure builds tolerance? I certainly didn’t. That’s a subtlety of the exposure game that I did not know.
E.g., from 27 weeks old my daughter has played in a little herb garden full of mud and grass I built for her. She grabs and eats leaves from the herb plants (the basil is entirely denuded, so that's a complete loss). At first I just wanted her to play in the garden out of the same naïve exposure-equals-tolerance model. I never would have considered that skin exposure is different from oral exposure. As it happened she ate the plant leaves, and it doesn't matter either way, since this part of immunity (to microbes here) doesn't work in the same way as peanuts anyway.
There is a joke that the book "Immune System 101" is 1000 pages long. Meaning the immune system is one of the most complicated systems in biology, simple logic arguments like yours above rarely apply, everything needs to be tested to be sure.
Bad advice that has a very long return on investment is quite sticky.
For instance the "cry it out method" did massive amounts of psychological damage to more than one generation, but it seemed to work in the short term as the babies eventually learned to "self-soothe".
Even now I still see parents and grandparents suggesting it in parenting groups; and taking extreme umbrage at the idea that it might have damaged them/their children.
And the variations on "a little spanking", "spare the rod", "dad would take us out behind the woodshed"...
Careful studies have shown that violence used on children percolates back out of them in rather rapid fashion. Something like a great majority of them go on to use violence in interactions with others within the next two weeks.
So, yes, as it turns out: a little spanking did hurt... specifically, it hurt innocent bystander kids.
Cry it out is bad advice? A relative of mine has a PhD in psychology, and she does it with her baby for sleep training, saying self-soothing is fine.
It's too big of a topic for an HN comment, but do a Google or LLM search and see. One widely accepted point is that a child cannot "self-soothe" until age 5-7. It's not developmentally possible, and using that language is a bit of a PR move to gloss over what is actually happening.
https://www.google.com/search?q=infant+co-regulation+vs+self...
https://en.wikipedia.org/wiki/Emotional_self-regulation
https://en.wikipedia.org/wiki/Co-regulation_(communication)
https://en.wikipedia.org/wiki/Attachment_theory
People did understand what to do, it just turns out their understanding was wrong. We might still be wrong though, one study isn't definitive proof of anything. We have to make decisions with the knowledge we have at the time, and it's normal for those decisions to look dumb in hindsight.
The 2000 guidance was based on expert opinion because there were no studies. LEAP was published in 2015, and it gave the first level 1 data on peanuts.
Anaphylactic shock is scary, and peanut fear was a big deal in the late 1990s, but the actual risk of harm was very low. The guidance was more about the psychosocial burden placed on parents when there was no guidance. Anxious parents have been studied; that mechanism is reasonably well understood, and that harm can be quantified.
Hindsight is 20/20. The fact is that thousands of children were dying and public health officials were set to task to identify interventions that help.
They knew that skin and mucosa sensitization can occur in response to allergens.
A reasonable hypothesis is that there’s some boot-up process with the immune system that needs to occur before anything happens. The kids are dying today. “Avoid the thing that can cause sensitization” is a conservative position.
It is unusual that it turned out to be the opposite: that oral exposure induces tolerance. It’s the fog of war.
The standard conservative intervention has helped in the past: I’m pretty sure seatbelts didn’t have strong mortality data before they were implemented. If it had turned out that more people were killed by seatbelts trapping them in vehicles, it would make for a similar story. I think they also excluded all blood donations from men who have sex with men during the initial stages of the HIV pandemic (with no evidence at the time).
Edit for response to comment below since rate-limited:
Wait, I thought it was on the order of ~150 people per year dying from food anaphylaxis, though I didn't research that thoroughly; it was off the top of my head. If you're right, the conservative advice was definitely far too much of an intervention, and I agree entirely.
"The fact is that thousands of children were dying"
What? That's insane, 4-5 kids were dying a year. The whole thing was mass hysteria, that then started to create the problem when there had been none.
You do not know what you don't know.
[dupe] https://news.ycombinator.com/item?id=45647133
[flagged]
you should really educate yourself on lactose intolerance, or really how you view medical conditions in general. being a "bitch" or not has nothing to do with whether your body produces certain enzymes.
> Just one or two nights of pain
you shouldn't be allowed near children if that's your approach
Undoing of the effects of excessive and unnecessary social guidance takes ages.
At some point in the history of civilizations, humans started having less work to do and more idle people around. The idle people started spending their time preaching a lifestyle other than what had evolved naturally through centuries and millennia. They redefined the meaning of health, food, comfort, and happiness. The silliest thing they did was creating norms, redefining good and bad based on their perception of comfort and happiness, and enforcing those norms on populations.
The human race continued to live in the clutches of perceptions from these free-thinking idle people, whose minds worked detached from their bodies and thus lacked the knowledge gained from millennia of human evolution.
The natural lifestyle for centuries and millennia was just dying.
We went from about 50% infant mortality to maybe 1 in 1000.
I think people seek out these restrictions on their own. Almost everyone I know has some sort of belief about what's healthy and what isn't.
Some people become vegetarians, some people become vegans, some people believe eating big steaks of red meat is healthy, some people avoid pork, others do not eat cooked food on some days of the week, others eat only fish on special holidays, some people tell you that yoghurt is good for your gut, others tell you to avoid dairy at all cost, some tell you to avoid carbohydrates, ....
Some of these are backed by science, some are batshit crazy, some are based on individual preferences.
I don't think this is a new phenomenon. People just love coming up with rules, and even if our society allows you to eat pretty much whatever you want, people still seek out restrictions for themselves (and their kids...)
You have left out the elephant in the room: the government controlling food choices, healthcare, medicines, and your overall lifestyle. You don't have as much freedom as you would like to think.
What are you talking about? Food safety regulations like requiring milk to be pasteurized?
I think that's just common sense, but at least in my home of Austria you can still easily get un-pasteurized milk if you really want. I'm not sure how the "government" controls my food choices? (In some cases I would actually prefer more regulation, because some producers make some questionable choices. I would prefer to buy cured meat without nitrates, but it's quite hard to find)
Have you ever heard of obesity and the variety of diseases that are mostly specific to certain countries and their lifestyles? If not, you should travel to some third-world countries. This is only to show you that your government is the biggest stakeholder in and controller of your lifestyle, not you.
You really think a sense of embodiment can be lost voluntarily?
Yes, when the mind is over-confident in its education and perceptions, it starts to disobey the signals from the body and forces the body to follow what the mind says. That's when the mind loses the support of the knowledge encoded in the body, the knowledge which was collected through evolution.
The mind tries to compensate for the loss with experimentation that can't undergo the same extent of evolution. Then it dictates that the body follow the results of these puny, tiny experiments, and ignores the rich knowledge already encoded in the body.
>when the mind is over-confident in its education and perceptions, it starts to disobey the signals from the body and forces the body to follow what the mind says.
Isn't that one of the fundamental things being taught to nascent minds as a prerequisite to participating in society -- starting the earliest stages of development, at which point neither one's mind nor one's body really has much of a say in the matter?