Speed of light may not be constant (article)
This topic has 14 replies, 3 voices, and was last updated 11 years, 7 months ago by russelln.
April 28, 2013 at 10:08 pm #40585
Speed of light may not be constant, physicists say
By Jesse Emspak, LiveScience contributor
The speed of light is constant, or so textbooks say. But some scientists are exploring the possibility that this cosmic speed limit changes, a consequence of the nature of the vacuum of space.
The definition of the speed of light has some broader implications for fields such as cosmology and astronomy, which assume a stable velocity for light over time. For instance, the speed of light comes up when measuring the fine structure constant (alpha), which defines the strength of the electromagnetic force. And a varying light speed would change the strengths of molecular bonds and the density of nuclear matter itself.
A non-constant speed of light could mean that estimates of the size of the universe might be off. (Unfortunately, it won’t necessarily mean we can travel faster than light, because the effects of physics theories such as relativity are a consequence of light’s velocity).
Two papers, published in the European Physical Journal D in March, attempt to derive the speed of light from the quantum properties of space itself. The two propose somewhat different mechanisms, but the idea in both is that the speed of light might change as one alters assumptions about how elementary particles interact with radiation. Both treat space as something that isn’t empty but a great big soup of virtual particles that wink in and out of existence in tiny fractions of a second.
Cosmic vacuum and light speed
The first, by lead author Marcel Urban of the Université Paris-Sud, looks at the cosmic vacuum, which is often assumed to be empty space. The laws of quantum physics, which govern subatomic particles and all things very small, say that the vacuum of space is actually full of fundamental particles, such as quarks, called “virtual” particles. These matter particles, which are always paired with their antiparticle counterparts, pop into existence and almost immediately collide; when matter and antimatter particles touch, they annihilate each other.
Photons of light, as they fly through space, are captured and re-emitted by these virtual particles. Urban and his colleagues propose that the energies of these particles, specifically the amount of charge they carry, affect the speed of light. Since the amount of energy a particle will have at the time a photon hits it is essentially random, the effect on how fast photons move should vary too.
As such, the amount of time the light takes to cross a given distance should vary as the square root of that distance, though the effect would be very tiny: on the order of 0.05 femtoseconds per square root of a meter of vacuum crossed. A femtosecond is a millionth of a billionth of a second. (The speed of light has been measured over the last century to high precision, on the order of parts per billion, so it is pretty clear that the effect has to be small.)
To find this tiny fluctuation, the researchers say, one could measure how light disperses over long distances. Some astronomical phenomena, such as gamma-ray bursts, produce pulses of radiation from far enough away that the fluctuations could be detected. The authors also propose bouncing a laser beam multiple times between mirrors placed about 100 yards apart to seek those small changes.
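The square-root scaling described above is easy to put into numbers. A minimal sketch, assuming the article's 0.05-femtosecond figure applies per square root of a metre traversed (the 1-billion-light-year distance is an illustrative gamma-ray-burst scale, not a value from the papers):

```python
import math

SIGMA_0 = 0.05e-15        # s per sqrt(metre): the fluctuation figure quoted in the article
LIGHT_YEAR_M = 9.4607e15  # metres in one light-year

def arrival_jitter(distance_m: float) -> float:
    """Photon timing spread that grows as the square root of the path length."""
    return SIGMA_0 * math.sqrt(distance_m)

# A gamma-ray burst roughly 1 billion light-years away
d = 1e9 * LIGHT_YEAR_M
print(f"predicted jitter over {d:.2e} m: {arrival_jitter(d):.2e} s")
```

That works out to a spread of roughly 0.15 milliseconds, which is why distant, sharp pulses such as gamma-ray bursts are attractive probes, while tabletop distances give effectively nothing.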
Particle species and light speed
The second paper proposes a different mechanism but comes to the same conclusion: that light speed changes. In that case, Gerd Leuchs and Luis Sánchez-Soto, from the Max Planck Institute for the Physics of Light in Erlangen, Germany, say that the number of species of elementary particles that exist in the universe may be what makes the speed of light what it is.
Leuchs and Sánchez-Soto say that there should be, by their calculations, on the order of 100 “species” of particle that carry charge. The current law governing particle physics, the Standard Model, identifies nine charged ones: the electron, the muon, the tauon and the six kinds of quark, along with the charged W boson (the photon carries no charge).
The charges of these particles are central to the model: a quantity called the impedance of the vacuum depends on the sum of those charges. The impedance in turn depends on the permittivity of the vacuum, or how much it resists electric fields, as well as its permeability, or how well it supports magnetic fields. Light waves are made up of both an electric and a magnetic wave, so changing those quantities (permittivity and permeability) will change the measured speed of light.
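The textbook relations behind this paragraph are c = 1/sqrt(ε0·μ0) and the vacuum impedance Z0 = sqrt(μ0/ε0). A quick numerical check with the accepted SI values — this only illustrates the standard relations, not the papers' model:

```python
import math

EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m
MU0 = 1.25663706212e-6   # vacuum permeability, H/m

c = 1.0 / math.sqrt(EPS0 * MU0)  # speed of light follows from the two vacuum constants
Z0 = math.sqrt(MU0 / EPS0)       # impedance of free space

print(f"c  = {c:,.0f} m/s")   # close to the defined 299,792,458 m/s
print(f"Z0 = {Z0:.2f} ohms")  # close to 376.73 ohms
```

So any mechanism that shifts the effective permittivity or permeability of the vacuum necessarily shifts the measured speed of light with it.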
“We have calculated the permittivity and permeability of the vacuum as caused by those ephemeral virtual unstable elementary particles,” Sánchez-Soto wrote in an email to LiveScience. “It turns out, however, from such a simple model one can discern that those constants contain essentially equal contributions of the different types of electrically charged particle-antiparticle pairs: both, the ones known and those so far unknown to us.”
Both papers say that light interacts with virtual particle-antiparticle pairs. In Leuchs’ and Sánchez-Soto’s model, the impedance of the vacuum (which would speed up or slow down the speed of light) depends on the density of the particles. The impedance relates to the ratio of electric fields to magnetic fields in light; every light wave is made up of both kinds of field, and the impedance’s measured value, together with the permittivity and permeability of space, governs the speed of light.
Some scientists are a bit skeptical, though. Jay Wacker, a particle physicist at the SLAC National Accelerator Laboratory, said he wasn’t confident about the mathematical techniques used, and that it seemed in both cases the scientists weren’t applying the mathematical tools in the way that most would. “The proper way to do this is with the Feynman diagrams,” Wacker said. “It’s a very interesting question [the speed of light],” he added, but the methods used in these papers are probably not sufficient to investigate it.
The other issue is that if there really are a lot of other particles beyond what’s in the Standard Model, then that theory needs some serious revision. But so far its predictions have been borne out, notably with the discovery of the Higgs boson. This doesn’t mean there aren’t any more particles to be found, but if they are out there, they’re above the energies currently achievable with particle accelerators, and therefore pretty heavy, and it’s possible that their effects would have shown up elsewhere.
May 7, 2013 at 1:25 am #40586
lol
May 7, 2013 at 2:29 am #40588
Did you actually read the article?
Lots of areas of physics disagree with each other, e.g.
general relativity and quantum physics . . . There needs to be a larger model that incorporates them all.
It seems to me possible that the speed of light could be
effectively constant, but vary to a small degree depending
on the structure of space-time. This is akin to Lorentz
transformations appearing Newtonian if objects move at less
than 10% of c. Light speed being apparently constant
could be a limiting case.

Besides, shine a beam of light through water, and it goes
slower than light in a vacuum, so it seems reasonable that the
speed could also depend on the nature of space-time itself.

In my opinion,
Steven

May 7, 2013 at 11:57 am #40590
The equivalence of all inertial systems implies that the speed of light is constant in all of them. The theory of relativity has been confirmed by many experiments at all the colliders.
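The "10% of c" rule of thumb mentioned above can be made concrete with the Lorentz factor γ = 1/√(1 − v²/c²); a small illustrative script:

```python
import math

def lorentz_gamma(beta: float) -> float:
    """Time-dilation / length-contraction factor for speed beta = v/c."""
    return 1.0 / math.sqrt(1.0 - beta ** 2)

for beta in (0.01, 0.1, 0.5, 0.9):
    print(f"v = {beta:.2f} c  ->  gamma = {lorentz_gamma(beta):.5f}")
```

At 10% of c, gamma is only about 1.005, a half-percent departure from Newtonian behaviour, which is exactly why relativistic corrections are invisible at everyday speeds and Newtonian mechanics looks like the whole story.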
May 7, 2013 at 12:21 pm #40592You are missing my point.
It may very well be constant on Earth (roughly),
where space-time is (roughly) the same.
It may not be as you move elsewhere.
Unless experiments are done in other locations in
the universe, it may only be a local approximation.

An inertial system is an idealized situation.
Nothing is truly an inertial system.
Even on Earth, we are in a gravity well.
There is no point in the universe “free” from gravity/acceleration.
Therefore special relativity is no more “correct” than
Newtonian physics . . . both are approximations and, in truth, wrong.

The BIG law in physics is that new physics with different laws
can come up, and they can be true SO LONG AS, in the appropriate
approximation of the model you are already using, the
old laws and equations fall out. Just as Newtonian mechanics
is a special case of special relativity in the limit of
low velocity, special relativity can be a limiting case
of a larger model (and we know from the incompatibilities
of general relativity and quantum mechanics that general
relativity is probably not the correct larger model).

S
May 7, 2013 at 11:10 pm #40594
As you know, there is the fine-structure constant alpha = e^2/(hc) = 1/137, where e is the electric charge of the electron, h is Planck’s constant and c is the speed of light. What will happen if c becomes slightly different? I think that e and h will also change their values so that alpha remains the same 1/137. But that means that the Bohr radius and the Compton wavelength of the electron will also change their values without changing their ratio. Therefore experimental devices will never notice a change in the speed of light c, since all devices still consist of atoms. I call this scaling.
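In SI units the fine-structure constant is alpha = e²/(4πε0·ħc). A quick sketch of the poster's scaling argument, nudging c and rescaling e together so that alpha stays fixed (the 1% shift is purely illustrative):

```python
import math

E = 1.602176634e-19      # elementary charge, C
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
C = 299792458.0          # speed of light, m/s
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

def alpha(e: float, hbar: float, c: float) -> float:
    """Fine-structure constant in SI form."""
    return e ** 2 / (4 * math.pi * EPS0 * hbar * c)

a0 = alpha(E, HBAR, C)
print(f"1/alpha = {1 / a0:.3f}")  # about 137.036

# the "scaling" idea: raise c by 1% and rescale e so that alpha stays put
a1 = alpha(E * math.sqrt(1.01), HBAR, C * 1.01)
print(math.isclose(a0, a1))  # True
```

Since atomic sizes and spectra are set by dimensionless combinations like alpha, a change in c compensated this way would indeed be invisible to instruments built out of atoms, which is the poster's point.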
May 7, 2013 at 11:34 pm #40596
May 13, 2013 at 9:05 pm #40598
From holoscience.com
May be relevant to topic.
——————-
The Electric Universe takes a simplifying leap by unifying the nuclear forces, magnetism and gravity as manifestations of a near instantaneous electrostatic force. Instead of being spooked by the concept of action-at-a-distance, like most physicists this century, the Electric Universe accepts it as an observational fact. Anyone who has tried to force two like poles of magnets together has demonstrated action-at-a-distance. Electromagnetic radiation is then simply the result of an oscillating electrostatic force.
At the level of the atom, the Electric Universe model takes a lead from the work of Ralph Sansbury, an independent New York researcher. Foremost is the simple recognition of the basic electrical nature of matter and the primacy of the electrostatic force** in matter interactions. It also rests upon the simple assumption that the proton, neutron and electron are composed of smaller charged particles, orbiting each other in a classical sense in stable, resonant orbits. That is, the energy exchanged between those sub-particles in elastic deformation during each orbit sums to zero. Being charged, the sub-particles interact via the electrostatic force. A simple calculation shows that the sub-particles that form an electron must travel at a speed far in excess of the speed of light: some 2.5 million light-years per second, or from here to the far side of the Andromeda galaxy in one second! So the electrostatic force must act at a speed which is almost infinite on our scale for the electron to be stable. It is the stable orbital resonances of these sub-particles, both within and between particles, that give rise to the phenomena of protons, neutrons, electrons and atoms. Other denizens of the particle zoo are merely transient resonant states of the same charged sub-particles. The so-called creation of matter from energetic photons is an illusion in which pre-existing matter is reorganized into new resonant states that give the impression that a particle has suddenly materialized. Antimatter is a misnomer since it too is formed from the same sub-particles as normal matter except that the total charge is mirrored. Matter cannot be created or annihilated.
A Conventional View of Forces in Physics
1. Nuclear forces keep the nucleons (protons and neutrons) together in the atomic nucleus. They are the dominating forces in the nucleus, but of no importance at large distances from it.
2a. Electric forces. A positive charge and a negative charge attract each other, but similar charges repel. Electric forces keep the atoms together (bind the electrons to the nucleus). They are of a certain importance in the nucleus. At large distances electric forces are usually not so important because of a screening effect. For example, a positive charge attracts negative charges to its neighborhood so that they screen off the field from the positive charge.
2b. Magnetic forces are closely related to the electric forces. Because they cannot be screened very easily, they are efficient at larger distances than electric forces. Example: the Earth’s magnetic field.
3. Gravitation is much weaker than electric forces and therefore of no importance in the atom. As gravitation cannot be screened, it is the dominating force at large distances. The orbits of the planets and the motions of stars and galaxies are ruled by gravitation. H. Alfvén.
Quantum Theory
For the first time the highly successful quantum theory gains a physical explanation in terms of resonant motion of charged particles, mediated by a near-instantaneous electrostatic force. A quantum electron orbit is one in which the exchange of energy between all of the sub-particles in the nucleus of an atom and those in an orbiting electron sums to zero over the orbit. Exchange of energy takes the form of distortion of a particle to form an electrostatic dipole or a move to a new resonant orbit.
Relativity Theory
Einstein’s Special Theory was designed to define simultaneity in a universe where the fastest force or signal was restricted to the measured speed of detection of light from a distant source. With an electrostatic force of near-infinite speed acting between the sub-particles of all matter, relativity theory reduces to classical physics. This leaves open the question of what we are measuring when we determine the speed of light. The speed of light in galactic terms is exceedingly slow, requiring about 150,000 years to cross our galaxy. However, the astronomer Halton Arp has shown that the redshifts of entire galaxies are quantized, which requires some form of near-instantaneous, galaxy-wide communication at the sub-atomic level. There are now several reported experiments that demonstrate faster-than-light effects. With the Special Theory gone, and the universe in communication with its parts effectively in real-time, there can be no time travel, and space and time are independent. Common sense has always suggested that this was so. Einstein’s General Theory was devised to explain gravity. It attempts to discard the observed action-at-a-distance of gravity by proposing a counter-intuitive warping of space in the presence of massive objects. This unnecessary complication of space is then added to the current metaphysical concepts of what constitutes the mass of an object. But space must also warp at near-infinite speed to produce the observed planetary orbits. Common sense, observation, and parsimony of hypotheses all suggest that the electrostatic model of gravity (see below) is superior. There is now experimental evidence from gravity measurements at the time of a total solar eclipse that supports the Electric Universe model and discounts the General Relativity model.
E = mc2
Einstein’s famous mathematical expression E=mc2, equating energy and mass, is known by almost everyone. However, most textbooks go on to use the word matter in place of mass. But nowhere has it been shown that mass and matter are interchangeable. In fact, we are entirely ignorant of what constitutes the mass of an object. So it is inadmissible to imply that energy and matter are interchangeable. The ultimate expression of this idea led to the nonsense of the big bang. It seems simpler and more sensible to suggest that both nuclear and chemical energy is released or absorbed by the rearrangement of the resonant orbits of charged particles. It is then common sense to suggest that mass is the measured response of a system of charged particles to an external electrostatic force. The more massive an object, the more the electrostatic force contributes to the elastic deformation of its protons, neutrons and electrons, rather than their acceleration. This is the phenomenon seen in particle accelerators and conventionally attributed to relativistic effects. But relativity reduces to classical physics in a universe where the electrostatic force has near-infinite speed. The first question to be asked is: if it is that simple, why hasn’t it been thought of long ago? The answer seems to lie in the propensity for mathematical theory to supersede common sense and observation. There is also a problem of language when mathematicians attempt to provide real meaning for their symbols.
http://www.holoscience.com/wp/synopsis/synopsis-11-some-basics/
May 13, 2013 at 9:46 pm #40600
PS the future, from the next page: “The invisible energy source in space is electrical.”
Hence chi filled, shakti full, etc.
—–
The speed of light is not a barrier. Real-time communication over galactic distances may be possible. Therefore time is universal and time travel is impossible. Anti-gravity is possible. Space has no extra dimensions in which to warp or where parallel universes may exist. There is no zero-point vacuum energy. The invisible energy source in space is electrical. Clean nuclear power is available from resonant catalytic nuclear systems. Higher energy is available from resonant catalytic chemical systems than in the usual chemical reactions. Biological enzymes are capable of utilizing resonant nuclear catalysis to transmute elements. Biological systems show evidence of communicating via resonant chemical systems, which may lend a physical explanation to the work of Rupert Sheldrake. DNA does not hold the key to life but is more like a blueprint for a set of components and tools in a factory. We may never be able to read the human genome and tell whether it represents a creature with two legs or six because the information that controls the assembly line is external to the DNA. There is more to life than chemistry. We are not hopelessly isolated in time and space on a tiny rock, orbiting an insignificant star in an insignificant galaxy. We are hopefully connected with the power and intelligence of the universe.
The future in an Electric Universe looks very exciting indeed!
May 14, 2013 at 4:50 pm #40602
Of course, if you are right, no such measured difference would be observed . . .
But this is, of course, supposition.

It is very well possible that alpha = 1/137 + epsilon, where epsilon
is some very tiny perturbation dependent on space-time geometry or
who knows what else.

Of course, there is this “Electric Universe” model that russelln is
putting forth. Maybe you can weigh in on that.

S
May 14, 2013 at 5:04 pm #40604
There are lots of different models;
the only question is whether or not they can
be experimentally verified.

Much of physics seems to rest on the idea of having
Model A, having experimental results consistent with
Model A; Model A is then adopted. Mathematics, laws,
and physical phenomena are described in terms of Model A.
Then unresolvable paradoxes occur within Model A.
Eventually Model A is scrapped for a slightly less
intuitive Model B which seems to alleviate said paradoxes.
Model B seems to be experimentally verified.
Repeat the process until Model B has paradoxes, etc.

This continues so often that eventually *nothing* really
makes a lot of sense, but it is accepted as conventional
wisdom.

Originally, the “spin” of an electron really was thought of
in terms of an electron spinning, until it was realized that
such spinning would be faster than light, and therefore
“not possible”. Thus spin became “an intrinsic property”,
but not actual spinning, etc.

This is what I am reminded of when I read about subatomic
or subnuclear particles orbiting at supra-light speeds.

I don’t know . . . I don’t know enough about the physics
in question and subsequent experiments to know how valid
this “Electric Universe” model is. Maybe STALKER2002 can
weigh in here. He is a physicist.

S
May 14, 2013 at 7:38 pm #40606
Thanks Steven.
This group is described as:
“The Thunderbolts Project is an interdisciplinary collaboration of accredited scientists, independent researchers and interested individuals established in 2004. Its prime mission is to explore the Electric Universe paradigm. Historical and current discoveries in the sciences have placed a spotlight on the electromagnetic force in nature, from quantum worlds and biological systems to planetary, stellar, and galactic domains.” http://www.thunderbolts.info/wp/about/
R
May 14, 2013 at 10:29 pm #40608
“It also rests upon the simple assumption that the proton, neutron and electron are composed of smaller charged particles, orbiting each other in a classical sense in stable, resonant orbits.” Ralph Sansbury, an independent New York researcher, perhaps does not know that the proton and neutron are composed of three QUARKS, while the electron is not composed of smaller charged particles, according to modern particle physics…
May 14, 2013 at 10:50 pm #40610
Funny . . . I wasn’t even looking for this, but this article appeared in the news I check . . . same vein . . .
S
——————
Warp speed, Scotty? It may actually be possible…
By Jillian Scharr
TechNewsDaily
In the “Star Trek” TV shows and films, the U.S.S. Enterprise’s warp engine allows the ship to move faster than light, an ability that is, as Spock would say, “highly illogical.”
However, there’s a loophole in Einstein’s general theory of relativity that could allow a ship to traverse vast distances in less time than it would take light. The trick? It’s not the starship that’s moving; it’s the space around it.
In fact, scientists at NASA are right now working on the first practical field test toward proving the possibility of warp drives and faster-than-light travel. Maybe the warp drive on “Star Trek” is possible after all.
According to Einstein’s theory, an object with mass cannot go as fast or faster than the speed of light. The original “Star Trek” series ignored this “universal speed limit” in favor of a ship that could zip around the galaxy in a matter of days instead of decades. They tried to explain the ship’s faster-than-light capabilities by powering the warp engine with a “matter-antimatter” engine. Antimatter was a popular field of study in the 1960s, when creator Gene Roddenberry was first writing the series. When matter and antimatter collide, their mass is converted to kinetic energy in keeping with Einstein’s mass-energy equivalence formula, E=mc2.
In other words, matter-antimatter collision is a potentially powerful source of energy and fuel, but even that wouldn’t be enough to propel a starship to faster-than-light speeds.
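The energy scale behind that claim is easy to quantify from E = mc². A minimal sketch (the 1 kg figure is purely illustrative):

```python
C = 299792458.0  # speed of light, m/s

def annihilation_energy(mass_kg: float) -> float:
    """Energy released when mass_kg of matter annihilates with mass_kg of antimatter.

    Both the matter and the antimatter contribute their rest energy, hence the factor of 2.
    """
    return 2.0 * mass_kg * C ** 2

joules = annihilation_energy(1.0)
print(f"{joules:.2e} J")                     # about 1.8e17 J
print(f"{joules / 4.184e15:.0f} Mt of TNT")  # roughly 43 megatons
```

Enormous by chemical standards, yet as the article notes, still mere kinetic energy: no amount of it pushes a massive ship past c.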
Nevertheless, it’s thanks to “Star Trek” that the word “warp” is now practically synonymous with faster-than-light travel.
Is warp drive possible?
Decades after the original “Star Trek” show had gone off the air, pioneering physicist and avowed Trek fan Miguel Alcubierre argued that maybe a warp drive is possible after all. It just wouldn’t work quite the way “Star Trek” thought it did.
Things with mass can’t move faster than the speed of light. But what if, instead of the ship moving through space, the space was moving around the ship?
Space doesn’t have mass. And we know that it’s flexible: space has been expanding at a measurable rate ever since the Big Bang. We know this from observing the light of distant stars: over time, the wavelength of the stars’ light as it reaches Earth is lengthened in a process called “redshifting.” According to the Doppler effect, this means that the source of the light is moving farther away from the observer, i.e., Earth.
So we know from observing redshifted light that the fabric of space is movable.
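The redshift reasoning above can be sketched numerically with the low-velocity Doppler approximation v ≈ cz (the observed wavelength below is hypothetical, chosen to mimic a shifted hydrogen H-alpha line):

```python
C = 299792458.0  # speed of light, m/s

def redshift(lam_emitted_nm: float, lam_observed_nm: float) -> float:
    """Fractional lengthening of the wavelength, z."""
    return (lam_observed_nm - lam_emitted_nm) / lam_emitted_nm

def recession_velocity(z: float) -> float:
    """Low-z Doppler approximation: v = c * z (breaks down as z approaches 1)."""
    return C * z

z = redshift(656.3, 662.9)  # H-alpha rest wavelength vs. a hypothetical observed value
print(f"z = {z:.4f}, v = {recession_velocity(z) / 1000:.0f} km/s")
```

A 1% stretch of the wavelength already corresponds to a recession speed of roughly 3,000 km/s, which is how astronomers turn spectra into expansion measurements.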
Alcubierre used this knowledge to exploit a loophole in the “universal speed limit.” In his theory, the ship never goes faster than the speed of light; instead, space in front of the ship is contracted while space behind it is expanded, allowing the ship to travel distances in less time than light would take. The ship itself remains in what Alcubierre termed a “warp bubble” and, within that bubble, never goes faster than the speed of light.
Since Alcubierre published his paper “The Warp Drive: Hyper-fast travel within general relativity” in 1994, many physicists and science fiction writers have played with his theory, including “Star Trek” itself.
Alcubierre’s warp drive theory was retroactively incorporated into the “Star Trek” mythos by the 1990s TV series “Star Trek: The Next Generation.”
In a way, then, “Star Trek” created its own little grandfather paradox: Though ultimately its theory of faster-than-light travel was heavily flawed, the series established a vocabulary of light-speed travel that Alcubierre eventually formalized in his own warp drive theories.
The Alcubierre warp drive is still theoretical for now. “The truth is that the best ideas sound crazy at first. And then there comes a time when we can’t imagine a world without them.” That’s a statement from the 100 Year Starship organization, a think tank devoted to making Earth what “Star Trek” would call a “warp-capable civilization” within a century.
The first step toward a functional warp drive is to prove that a “warp bubble” is even possible, and that it can be artificially created.
That’s exactly what physicist Harold “Sonny” White and a team of researchers at NASA’s Johnson Space Center in Texas are doing right now.
NASA’s warp drive project
According to Alcubierre’s theory, one could create a warp bubble by applying negative energy, or energy created in a vacuum. This process relies on the Casimir effect, which states that a vacuum is not actually a void; instead, a vacuum is actually full of fluctuating electromagnetic waves. Distorting these waves creates negative energy, which possibly distorts space-time, creating a warp bubble.
To see if space-time distortion has occurred in a lab experiment, the researchers shine two highly targeted lasers: one through the site of the vacuum and one through regular space. The researchers will then compare the two beams, and if the wavelength of the one going through the vacuum is lengthened, i.e. redshifted, in any way, they’ll know that it passed through a warp bubble.
White and his team have been at work for a few months now, but they have yet to get a satisfactory reading. The problem is that the field of negative energy is so small, the laser so precise, that even the smallest seismic motion of the Earth can throw off the results.
When we talked to White, he was in the process of moving the test equipment to a building on the Johnson Space Center campus that was originally built for the Apollo space program. “The lab is seismically isolated, so the whole floor can be floated,” White told TechNewsDaily. “But the system hadn’t been (activated) for a while so part of the process was, we had the system inspected and tested.”
White is now working on recalibrating the laser for the new location. He wouldn’t speculate on when his team could expect conclusive data, nor how long until fully actuated warp travel might be possible, but he remains convinced that it’s only a matter of time.
“The bottom line is, nature can do it,” said White. “So the salient question is, ‘can we?'”
May 15, 2013 at 6:34 am #40612
Thanks. On reflection I’m giving up particles (and their supposed speed) and going back to the Wave Structure of Matter (WSM) as per the site linked below.