(Incandescents also flicker at twice the 50/60 Hz mains frequency of course, but the thermal inertia of the filament makes this a lower-amplitude flicker.)
They also flicker really badly if your power isn't clean, for example if you have a decent-sized training rig on a different circuit.
Incandescents are basically little light inductors, and I would imagine their luminosity curve is sinusoidal versus whatever hell an LED driver chip puts out.
Sidenote: my partner and I both "feel" the difference of cheap LEDs. It's not something we can pin down, but it got way better with Hue lights.
We live with multiple dogs and I'm really curious. One of the dogs that we often had around had epilepsy so severe that very strong medication was needed, and she still died way too early, around the age of 2. She had fewer seizures with us than in her original home, which of course might be unrelated to the lighting. But your thought is interesting.
Psychological or physiological unease at least; I assume this from the way rapidly flickering, dying fluorescent lights make me feel.
Nope. Lots of people see the strobing. It causes headaches if you focus on the lights.
https://www.hubermanlab.com/episode/red-light-to-improve-met...
The show notes have links to quite a few more papers. I have no idea if this is good science but this is not just a one off paper.
For the most part this was a very positive step. Prices for LED bulbs plunged when they went from the "premium" energy-efficient alternative to the default option. But you also get a lot of crap on the market, and stuffing LEDs into form factors designed for incandescent bulbs makes good electrical and thermal design challenging, even for the brands that actually try.
Yeah, basically what the EU did was to say: for X watts of electricity, at least Y lumens of light must be produced. And this ratio was gradually increased. Since old-school light bulbs are quite inefficient at producing light, they slowly had to be phased out.
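A back-of-the-envelope sketch of why such an efficacy floor phases incandescents out. The lm/W figures below are typical ballpark values I've assumed, not numbers from the regulation:

```python
# Luminous efficacy = lumens out per watt in. Values are ballpark assumptions.
lamps = {
    "incandescent 60 W": (710, 60),  # ~710 lm
    "halogen 42 W": (630, 42),
    "CFL 14 W": (800, 14),
    "LED 8 W": (806, 8),
}

for name, (lumens, watts) in lamps.items():
    print(f"{name}: {lumens / watts:.0f} lm/W")

# Any mandated floor above ~15 lm/W is impossible for a classic incandescent.
```

Raise the floor step by step and each older technology falls off the market in turn.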
Is this true? I’ve got LEDs in my house because they cost vastly less to run, and because I rarely have to replace the bulbs.
Some cheap LEDs do flicker (at 50 or 60 Hz). But that’s fairly easily solved. I don’t think I’ve noticed the flicker since some cheap bulbs I bought in 2014 or so.
Well… (Sorry, let me put my tinfoil hat on.) Yeah, that "noticed" part is what worries me. I do worry that there is some effect on our brains even though we might not perceive the flicker.
As an analogy, I got into those supposedly audiophile "Class D" (or "Class T" [1]) amplifiers over a decade ago. Every day I turned on the music in my office and coded with the T-amp playing. I would have told you at the time that, indeed, it sounded amazing.
Some time later I built a tube amplifier (The Darling [2], in case anyone cares—I've since built perhaps a dozen more).
When I brought it into the office and swapped it out for the T-amp, the change was subtle but immediately noticeable. I hate to fall back on audiophile terminology, but it's the best I have for the experience: I was suddenly aware of "listening fatigue" that had been a component of the T-amp. I hadn't even known it was fatiguing until I heard the tube amp in its place for days on end.
With the loss of color fidelity and the flickering issue, I'm embarrassed to say that incandescent is starting to look good to me again.
I might, as an experiment, replace only those lights that we turn on in the evening when we are relaxing, reading.
[1] https://en.wikipedia.org/wiki/Class-T_amplifier
[2] https://www.diyaudio.com/community/threads/darling-1626-amp.... and https://imgur.com/gallery/oh-darling-tube-amplifier-Lq2Sx
LEDs are just terrible in every way except electrical consumption.
I disagree. Every other low-quality LED I bought around that time or even later is long dead by now.
I buy the ones that are suitable for dimmer switches (even though I don't have dimmers), because there is discernible flicker with most other LED bulbs if you, e.g., wave your arm through the air or make a saccade. There is also a certification (I think) for LED bulbs that are closer to sunlight in their emission spectrum.
At least in the EU it's true. Citing from Wikipedia: "The 2005 Ecodesign directive covered energy-using products (EuP), which use, generate, transfer or measure energy, including consumer goods such as boilers, water heaters, computers, televisions, and industrial products such as transformers. The implementing measures focus on those products which have a high potential for reducing greenhouse gas emissions at low cost, through reduced energy demand."
Can you even buy them without buying new old stock? In the US they're banned and there's zero production.
I recall there was a guy in the EU who tried to get around the regulations by selling "heat bulbs" that were exactly the same as traditional incandescent bulbs but marketed as a heat source, but I think he was slapped down.
If you look around a bit you can also get 60W or 100W lamps, sold as "industrial lamps" or "extreme temperature lamps", labeled as unsuitable for household use. But those are specialty lamps that you won't find in your local supermarket. Not sure if those are new old stock or imported.
Otherwise, if there is a power IC present, there is flicker, though fast enough for most humans to not perceive normally (you can still check it by waving your hand in front of the light and seeing the strobed afterimage.)
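One hedged way to put a number on that strobing is "percent flicker" (peak-to-trough modulation of the light output). The two driver waveforms here are illustrative assumptions, not measurements of any real bulb:

```python
import numpy as np

def percent_flicker(samples):
    """Peak-to-trough modulation of the light output, in percent."""
    return 100.0 * (samples.max() - samples.min()) / (samples.max() + samples.min())

t = np.linspace(0, 0.02, 1000, endpoint=False)  # one 50 Hz mains cycle

# A capacitor-less driver passes full-wave-rectified mains: deep 100 Hz ripple.
bare_rectified = np.abs(np.sin(2 * np.pi * 50 * t))
# A driver with decent output smoothing leaves a small ripple on a DC level.
well_smoothed = 0.9 + 0.1 * np.abs(np.sin(2 * np.pi * 50 * t))

print(f"bare rectified: {percent_flicker(bare_rectified):.0f}%")  # 100%
print(f"well smoothed:  {percent_flicker(well_smoothed):.1f}%")   # ~5.3%
```

The hand-waving test works because your eye integrates the strobed positions of your hand into a visible "fan" of afterimages when the modulation is deep.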
That seems remarkable and almost too good to be true?
I haven't been in this scene for many years, but when I was, I built my own lamp from a custom light engine (essentially a round PCB with DC-DC current-limiting circuitry and LEDs) with 3x 219B emitters, a Chinese body, and Carclo TIR optics...
Good times.
Standard LED bulbs are bright white, almost bluish, and yes, "bright but not illuminating" describes them well. I feel many modern car headlights have the same issue.
The human eye doesn't focus the blue end of the spectrum very well.
Where are you seeing these? Is this industrial/commercial suppliers?
I think the result would be much stronger if these baselines were comparable, showing that they accounted for other variables like time of day and light history. I am also skeptical of any effect in the retina lasting 6 weeks with no fading.
Consider that people are often exposed to much more infrared light outdoors, so "worked under a relatively dim incandescent lamp" is not a particularly novel stimulus. Presumably some of these people spent time outdoors during the six weeks, where there is thousands of times more infrared light.
It's an interesting niche topic that you may want your workplace to take note of if you work indoors.
However, you can get LEDs that do this well. Look for one with a CRI of 95 or higher.
So even that assumption would require further study.
The technology basically works by continuously microwaving (think oven) a small amount of sulfur gas. The development of solid-state microwave emitters — most microwave generation is still done with vacuum tubes — might help miniaturize the devices. However, it's hard to beat the simplicity of an LED.
It's a big thing, and you can buy LEDs which produce a better colour range, but they're much more expensive and not as energy efficient, because creating bold reds costs real energy and no diode trick will ever get around that.
Have you actually read the study? It's about infrared and has nothing to do with color rendering and visible spectrum. They're vaguely speculating about some mitochondrial mechanism of action not related to vision at all.
For LED lamps, the color must be controlled at the emission source, not by filtering, i.e. by using an adequate combination of different conversion phosphors, to ensure a neutral white with a quasi-continuous spectrum, instead of a bluish white with great narrow peaks in its spectrum.
Unfortunately, the phosphors for the latter variant are much cheaper than for the former, so the lamp vendors have the incentive to make the lamps as bad as possible.
I ask only because I was retrofitting some navigation lights on a sailboat - and you can’t just upgrade the original incandescent bulbs with LEDs (or aren’t supposed to).
You are either supposed to get a special LED (backing up what you’re saying) or there are some new red/green enclosures that are differently treated / tinted to then put a “white” led into.
But I am so far from an expert on that, I may be completely misunderstanding.
So, at the same electric power consumption you have less light, or you can compensate by using a more powerful lamp to get the same amount of light even with a filter. In both cases the energy efficiency becomes worse, i.e. the electricity cost per unit of output light is greater.
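A quick worked example with assumed numbers (any real lamp and filter will differ):

```python
# Assume a 10 W lamp producing 800 lm, behind a filter passing 60% of the light.
lamp_lumens = 800
lamp_watts = 10
filter_transmission = 0.6

filtered_lumens = lamp_lumens * filter_transmission   # light that actually exits
required_watts = lamp_watts / filter_transmission     # power to restore 800 lm out

print(f"same power, filtered: {filtered_lumens:.0f} lm")       # 480 lm
print(f"same light, filtered: {required_watts:.1f} W needed")  # 16.7 W
```

Either way, the filtered absorbed light is pure waste heat.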
On the other hand, when the manufacturer of the lamp controls inside the lamp the conversion of the light produced by the LED through fluorescence into the light that exits the lamp, there are chances to obtain a desired color and a certain shape of the emission spectrum by wasting less light than with external filters.
Filtering can correct a lamp color that is not the color that you want, but it cannot fill gaps in the emission spectrum of the lamp.
Cheap white LED lamps not only may have a too bluish color (or in some cases a too yellowish color), but their emission spectra may have gaps, so if a natural object from the environment happens to have a color that falls in a gap of the LED lamp spectrum, it will appear much darker than in daylight. This can cause orientation problems or difficulties in identifying certain things.
Where LEDs are used for signalling rather than lighting, pure colors are desirable and far fewer problems exist; e.g., replacing the old incandescent lamps with color filters in the red or yellow signal lights of cars with monochromatic LEDs without color filters has posed no difficulties.
You may have various LED lamps, all of which appear to have the same white color, but their spectra are very different. Those with narrow spectral peaks are very bad lighting sources, while those with wide spectral peaks, achieved by using multiple kinds of conversion phosphors, are much better lighting sources.
With the best LED lamps, there is not much difference in comparison with incandescent lamps. While incandescent lamps are best from the point of view of their continuous spectrum, their yellow light strongly modifies the perceived colors. While bluish white LEDs (e.g. of 6500 K color temperature) are also bad, neutral white LEDs (e.g. 5000 K or 5500 K) provide much better color perception than incandescent lamps.
For home lighting I prefer a white that is only very slightly yellowish, i.e. 4000 K lamps, instead of LED lamps with a higher color temperature or of incandescent lamps, which are so obviously yellow that no clothes have the same color as in daylight.
The best quality of lighting can be achieved by incandescent lamps in conjunction with frequency-selective filters, which modify their spectra to resemble the spectra of a blackbody with a higher temperature, like the Sun.
Such filtered incandescent lamps were used a very long time ago, to provide lighting for color photography, movies and television (e.g. for the original white point of NTSC), but they were abandoned due to high cost and due to an exceedingly low energy efficiency.
As I have mentioned in another comment, filtered incandescent lamps might see a revival, but implemented with very different technologies than those used a century ago.
However, if your book has color illustrations, a high-quality neutral-white LED lamp is better than any unfiltered incandescent lamp.
A standard white illuminant with filtered incandescent lamps would be even better, but such lamps, as they were made a century ago, were extremely good space heaters, which may prevent their use for reading a book.
I went through university with the same lamp/bulb, so that combo doesn't create the unwanted reflections much. Also, I still print everything in color, because a good color choice still boosts understandability of the material at hand.
> If this is done with a halogen bulb, which is a type of incandescent tungsten bulb, the filament lasts for a longer period as evaporated tungsten is redeposited on the filament rather than blackening the bulb glass. Hence, using a halogen bulb at lower voltage is a realistic alternative in terms of health and energy consumption.
Unfortunately, as I understand it, the redepositing action only occurs at high temperatures. It's a chemistry thing. I have been led to believe that running halogens at low voltages will cause the bulb glass to blacken sooner. See https://en.wikipedia.org/wiki/Halogen_lamp#Effect_of_voltage...
The incandescent lamps with tungsten filaments have a much lower temperature than the Sun, thus much more energy is radiated in infrared than needed.
About a year or two ago there was a discussion of a very interesting research paper that reported results from testing an improved kind of incandescent lamp, with energy efficiency and lifetime comparable to LED lamps.
The high energy efficiency was achieved by enclosing the lamp in a reflecting surface, which prevented energy loss by radiation, except for a window that let light out, which was frequency-selective, so only visible light got out, while infrared stayed inside. The lamp used a carbon filament in an environment that prevented the evaporation of the filament.
With such a lamp, one can make a tradeoff between energy efficiency and the content of healthy near infrared light, by a judicious choice of the frequency cutoff for the window through which light exits the lamp.
Even with enough near-infrared light, the efficiency should be a few times higher than for classic incandescent lamps, though not as good as for LED lamps. Presumably, one could reach an efficiency similar to that of the compact fluorescent lamps (which was about half of that of LED lamps), for such an incandescent lamp that also provides near-infrared light.
Perhaps I was not clear, but the reflective surface was the interior surface, so it reflected any light, visible or infrared, back towards the emitting filament, while the front window reflected only the infrared, while transmitting the visible light.
The lamp does not overheat, because the filament is kept at a constant temperature, the same as in a classic incandescent lamp. The difference is that you need a much lower electrical current through it for maintaining the temperature, because most of the heat is not lost away, like in a classic lamp. The fact that you need a much smaller electrical current for the same temperature is the source of the greater energy efficiency.
Only if you used the same electrical current as in a classic lamp would the lamp overheat and the filament be destroyed, but you have no reason to do that, just as you would not run a classic lamp at a higher-than-nominal current, which would overheat and destroy it.
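The steady-state argument can be sketched numerically; all figures here are made up for illustration, not taken from the paper:

```python
# Power balance for the IR-recycling lamp: in steady state the electrical input
# only needs to cover the *net* radiative loss, not everything the filament emits.
emitted = 60.0            # W radiated by the filament at operating temperature
returned_fraction = 0.8   # fraction the enclosure/window reflects back (assumed)
visible_fraction = 0.05   # classic incandescent: ~5% of radiation is visible

electrical_input = emitted * (1 - returned_fraction)  # ~12 W instead of 60 W
visible_out = emitted * visible_fraction              # ~3 W of visible light

print(f"input: {electrical_input:.0f} W, visible out: {visible_out:.1f} W")
print(f"luminous efficiency: {visible_out / electrical_input:.0%}")
```

The better the enclosure recycles the infrared, the smaller the current needed to hold the filament at its operating temperature, hence the efficiency gain.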
The article uses LED as a synonym for typical LED lighting.
10 years ago you had to work to find high CRI bulbs but could still find Cree bulbs pretty easily. Now you can get high CRI bulbs at the grocery store.
High CRI bulbs generally have low or no flicker because high CRI is toward the premium end of the market.
IR emission is not a "feature", it's a bug.
If you look at energy efficiency, it totally is. But the whole point in the discussion is that IR _might_ (according to the paper) have biological relevance.
Out of the manufacturers you listed, only Philips Ultra Definition (95 CRI, R9 90) have low flicker and good R9. Unfortunately they are poorly made and I have to keep buying new packs each year but it's more cost effective than Yuji for lesser used areas.
Also the claim from TFA is that NIR component improves visual performance (and I've read elsewhere that NIR also has health benefits).
Also, LED lighting can have infrared, have a significantly smoother spectrum curve, and still last 20k+ hours without burnout. The cheap bulb spectra that they show come from a blue LED + phosphor coating, but there are infrared LEDs, UV LEDs, and more. You can make quite a convincing sun simulation, even better than any incandescent bulb, but there is almost no demand for UV + infrared super-full-spectrum lighting, unfortunately. Only movie & theater lights come close.
Do you have a link to a bulb that you can purchase meeting all these criteria? The only one I'm aware of was this obscure "StarLike" that was never actually sold in bulk. LEDs can be made good in theory sure, but in practice they are all terrible in light quality compared to a standard incandescent.
No, you can't buy them as bulbs. The closest thing is those red light therapy panels that include them.
https://store.yujiintl.com/collections/high-cri-led-bulbs/pr...
You're paying through the nose though, but it finally exists now.
Do you really think $5 AUD per month per bulb that you’re running 8 hours a day is worth it for better spectrum quality?
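For context, a figure of that order is roughly the running cost of a single 60 W incandescent under assumptions like these (8 h/day and ~$0.35 AUD/kWh, both assumed):

```python
watts = 60              # assumed incandescent bulb
hours_per_day = 8
price_per_kwh = 0.35    # AUD, assumed tariff

kwh_per_month = watts / 1000 * hours_per_day * 30  # 14.4 kWh
cost = kwh_per_month * price_per_kwh

print(f"{kwh_per_month:.1f} kWh/month -> ${cost:.2f} AUD")  # ~$5.04
```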
Lightbulbs on the other hand affect all of society, so they've got a much larger impact to the overall CO2 budget.
Additionally, the average person uses a laptop or mobile devices, all of which use less power than even a single typical incandescent bulb (and people usually have many lightbulbs).
Replacing incandescent bulbs with LEDs saves a lot of CO2 at basically zero cost, while getting rid of computers saves less CO2 for a much larger economic impact.
And even the effect described by this article has to be looked at in context, considering most of the light people experience in a day (and have experienced for as long as Homo sapiens has existed) is natural sunlight, even in northern Europe during the winter (that's why EU law mandates windows with sunlight in every office, apartment, bedroom, etc.)
This isn’t some kind of controversial subject. Ensuring home appliances don’t overconsume energy is beneficial for everyone in society.
You don’t want to have brownouts, blackouts, or run out of heating gas/oil in the winter.
You bring up the idea of regulating computer equipment power efficiency as if it’s crazy talk but it’s a real thing in concept. Governments do offer guidance and sometimes regulate computer efficiency. They have efficiency standards (e.g. Energy Star) as well as relying on industry standards (e.g. “80 Plus”).
Take a look at your computer monitor or TV box and it probably has an energy star logo somewhere if you live in the US.
The US federal government and other state and local agencies will not buy computer products that aren't Energy Star compliant, and they encourage businesses and individuals to follow similar standards. Other countries might regulate beyond these (dis)incentives.
https://www.energy.gov/femp/purchasing-energy-efficient-comp...
And if you bring up data centers, those are considered productive industry with its own regulations, different from home regulations. Plenty of things legal in industrial settings aren't legal in your house.
Yes, there is something obviously wrong with most LED lights, but it isn't too much of short wavelength light, but on the contrary. It's the near absence of cyan light in most LEDs. Our eyes are by far the most sensitive to it, the majority of receptors in the eye are sensitive to it, and we may focus primarily on it (focus differs for different wavelengths). This is how you get the feeling of something being wrong with your vision as you for example walk into a mall, and so on.
If anything, higher temperature lights seem to make it better, not worse, but the problem will persist as long as the cyan hole stays there.
However, the experimental group (extra light sources) got Rf 91 bulbs, and the control ("LED lighting") got Rf 85 bulbs.
The two scales are not exactly comparable, but they both max out at 100. The only source I could find that discusses both says that > 90 CRI is "excellent" and just below that is "very good". It says > 85 Rf is "very good", which tells me it's comparable to a mid-80s CRI bulb.
If I accidentally buy a mid-80 CRI bulb, I either return it to the store, or just throw it away.
So, I'd say this study's experimental setup doesn't support any useful conclusions. They showed that so-painfully-bad-California-won't-subsidize-them LEDs are worse than passable LEDs with supplementation from another light source.
The passable LEDs in the study are probably comparable to the cheap ones at our local hardware store, but worse than the ones that cost $10-20 on amazon ten years ago.
This would have been much more interesting if they'd compared high-end LEDs with and without supplementation, and found a difference. (And by "high-end", I mean "still much cheaper than the electricity they save".)
Funny enough, the best evidence for this study is that they should probably move somewhere with more sunlight if they can't spell "color" right... /s
a) How do Philips Hue bulbs stack up?
b) Did Philips update them generationally and assuming they are decent now, how recently?
Ban on all fluorescent tubes (T5 and T8 lamps) from August 24, 2023
Ban on all CFL lamps from February 24, 2023
Extension of the exemption granted to HPD lamps from 3 to 5 years
Extension of the exemption for special purpose lamps from 3 to 5 years
We often see people reporting that they can clearly see lower-quality LED lights flicker, that it's really distracting to them, and that it even causes them headaches.
Now, I hadn't seen this until recently (except in failing lights), and only under the right conditions: the light has to be very, very dim. For instance, with only one light on at night, in a room far away from the light, so that it appears extremely dim. There, I could finally really see it flicker.
I've replaced that light with a better one and the effect went away.
there may be more light (photons), but the spectrum is too limited for my eyes to see by, unlike halogen, etc.
I still only use compact fluorescent in my home; LED is useless to me.
Can you elaborate?
LEDs are pushed as a solution to humans' energy consumption without any attention paid to the health effects. Hopefully it will take less than 80 years of cancers and metabolic disruption before the obvious is done.
But this time the regulation was captured pre-emptively, to the point that following the best scientific advice for your health is illegal in most of the developed world.
Please cite your sources then. And no, the other article you linked does not prove your claim.
> However, most studies relied on satellite-images with a very low resolution (1 to 5 km, from the Defense Meteorological Program [DMSP]) and without information on color of light
> noted that data quality suffered from many limitations due to the types of satellite images used and the focus in the vast majority on visual light levels only rather than considering the circadian-relevant blue light component, among others. Future studies should consider improved satellite-based ALAN technologies with improved resolution and information on spectral bands and apply these technologies to a variety of cancer sites to yield better estimates for the potential risks between ALAN and cancer.
So nothing conclusive about LED being bad for your health (vs other types of light).
But it seems like it's too much to ask? I should just accept whatever comments I read without any critical thinking?
People really should get this and stop sharing newly published papers with the general public. The value of a single academic paper is exactly 0. Even a handful of such articles still has 0 value to the general public. This is only of interest to other academics (or labs, countries, etc.) who may have the power to reproduce it in a controlled environment.
Be very skeptical of correlations like this that have dubious or poorly understood causation. Be even more skeptical if they are about day-to-day stuff that would likely have large swaths of people able to reproduce something like it on huge scales yet they haven't. Extraordinary claims require extraordinary evidence.
Considering the percentage of live mitochondria that are exposed to external light in a human this seems like an enormous effect. The effect we'd expect from publication bias though is already pretty big. I'm going to go with the latter until we've got some replication, and a plausible mechanism (like.. why wouldn't whales be badly sick if this was a thing?).
This explains why most mitochondria are exposed to infrared light, even those deep in the body.
The article also mentions an inhibiting effect of blue and violet light upon mitochondria. For that it should be valid what you say, that this effect can happen only in the superficial layer of the body, because both skin and blood strongly absorb such light.
Despite saying the visible flux component is "small" and that the tungsten lamps "were not expected to [be used] as task lamps," Figure 6 (a) and (c) show... desk lamps right at the work stations, like task lamps! Not only is this experimentally unblinded, but the visible light immediately in front of the test subjects is noticeably brighter and warmer. The effect could simply be due to reduced eye strain.
What would James Randi do? "Extraordinary claims require extraordinary proof," and unfortunately this isn't it.
This would be more interesting if they add a visible light filter on the lamps so they only emit infrared radiation, and have an identical double-blind control with a 60 watt heater bulb so it emits no SWIR but the same radiant heat (which could confound and/or unblind).