$10 Billion Science Project to Launch
AP
filed under: Science News, World News
GENEVA (Sept. 9) – Scientists will launch an experiment in a tunnel deep beneath the French-Swiss border Wednesday, hoping to find evidence of extra dimensions, invisible “dark matter,” and an elusive particle called the “Higgs boson.”
And although leading physicists such as Stephen Hawking say the atom-smashing experiment will be absolutely safe, some skeptics fear the proton collisions could unleash microscopic black holes that would eventually doom the Earth.
The most powerful atom-smasher ever built will produce collisions of protons traveling at nearly the speed of light in the circular tunnel, giving off showers of particles that will provide more clues as to how everything in the universe is made.
In the $10 billion project — the most extensive physics experiment in history — the Large Hadron Collider will come ever closer to re-enacting the “big bang,” the colossal explosion that, according to theory, created the cosmos.
The project, organized by the 20 member nations of the European Organization for Nuclear Research — known by its French initials CERN — has attracted researchers of 80 nationalities. Some 1,200 are from the United States, an observer country that contributed $531 million.
The collider is designed to push the proton beam close to the speed of light, whizzing around the 17-mile tunnel 11,000 times a second at full power. Ramping up to full power is probably a year away.
Smaller colliders have been used for decades to study the atom. Scientists once thought protons and neutrons were the smallest components of an atom’s nucleus, but experiments have shown they were made of still smaller quarks and gluons, and that there were other forces and particles.
The CERN experiments could reveal more about “dark matter,” antimatter and possibly hidden dimensions of space and time. It could also find evidence of the hypothetical particle — the Higgs boson — which is sometimes called the “God particle.” It is believed to give mass to all other particles, and thus to matter that makes up the universe.
Here is the full article.
One of the doomsday scenarios is that a scientific experiment goes awry and destroys the whole universe.
This could be it.
Black holes are a myth…. there are no such things as black holes…. anyway, we will find out in the next few weeks 🙂
The final armageddon game..
The particle collisions will be started in October.
Hawking radiation is no more than a hypothesis yet; it’s not proven. Should it turn out that there is none, then I think it means that the mini black holes will not immediately disintegrate but stay, and eventually grow exponentially. That is one of the concerns.
I think that what they are doing in the LHC is NOT completely identical to what happens every day in nature. When cosmic radiation hits Earth, particles at nearly light speed hit particles which, compared to that speed, virtually don’t move. So, the products of such a crash will still have high velocity. In the LHC, particles will collide coming from opposite directions, both at almost light speed (and at the same speed).
I am no Einstein but that seems a BIG difference to me.
For now, I’m not going to make big plans for November and beyond.
An interdisciplinary conference was held at a university to study evidence that the Vice Chancellor existed.
The theoretical physicists pointed to a black hole in the University’s accounts.
Black holes exist.
I’m also told there is a black hole at the centre of every galaxy, around which it rotates.
So, when this supercollider bangs the particles together and black holes are created, if for some reason the black holes start growing without control… can we just wait until they suck up France before we pull the plug?
I found a first-hand information source, by the LHC Safety Assessment Group at CERN:
http://lsag.web.cern.ch/lsag/LSAG-Report.pdf
Go to page 7. There, they explain: “There is, however, one significant difference between cosmic-ray collisions with a body at rest and collisions at the LHC, namely that any massive new particles produced by the LHC collisions will tend to have low velocities, whereas cosmic-ray collisions would produce them with high velocities.”
The follow-up text gives the arguments for why they think these collisions present no danger. It’s a bit “heavy stuff” to read for a non-physicist.
On their facts & figures page,
http://public.web.cern.ch/Public/en/LHC/Facts-en.html
they mention, among other impressive numbers, that they will have 600 million proton collisions per second (and on another page I read that the effective amount of experiment time will be 10 million seconds per year). That would mean 6,000 trillion = 6×10^15 collisions per year. I mention that because, with such numbers, it seems to me that even the most unlikely things can happen sooner or later – if they can happen at all by such collisions…
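Just to spell out that arithmetic, a trivial back-of-the-envelope check in Python, using only the two figures quoted above:

    # Figures quoted in the comment above (from CERN's facts & figures page)
    collisions_per_second = 600e6       # 600 million proton collisions per second
    effective_seconds_per_year = 10e6   # ~10 million seconds of experiment time per year

    collisions_per_year = collisions_per_second * effective_seconds_per_year
    print(f"{collisions_per_year:.1e} collisions per year")   # 6.0e+15, i.e. 6,000 trillion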
Anonymous of 7:05 is making the same error I made in my early days as a physics student- that of thinking that velocities at relativistic speeds add up in the same way they do in our common day-to-day experience.
When two cars collide head on- each one going exactly 60 mph- the relative velocity of the impact is very close to (but ever so slightly less than) 120 mph. We don’t notice the difference between the “expected” relative velocity and the actual one, because it is so minuscule. But it’s there.
However, at very high speeds- for instance, two objects moving towards each other, each at 0.99c (99 percent of the speed of light) relative to an observer- the difference becomes very noticeable. The two particles in question would NOT be moving at 1.98 times the speed of light relative to one another. Instead, the combined velocity is given by the relativistic velocity-addition formula, which follows from the Lorentz transformation. In this case, the two particles would be approaching each other at roughly 0.99995 times the speed of light.
Technical explanation here.
http://en.wikipedia.org/wiki/Velocity-addition_formula
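For the curious, here is a minimal Python sketch of that velocity-addition formula, w = (u + v) / (1 + uv/c^2), working in units where c = 1; the 0.99c values are just the illustrative numbers from the example above:

    def add_velocities(u, v):
        # Relativistic velocity addition; u and v are fractions of the speed of light.
        return (u + v) / (1 + u * v)

    # Two particles approaching each other, each at 0.99c relative to the lab:
    print(add_velocities(0.99, 0.99))   # ~0.99995, still below c (not 1.98)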
So, the idea that the LHC will somehow produce velocities or energies that exceed what can be produced naturally is incorrect. In fact, the Earth is pelted daily by insanely high-speed particles with energies many thousands (even millions) of times greater than anything the LHC can produce in its collisions.
http://en.wikipedia.org/wiki/Cosmic_rays
And… we’re still here.
Brad Hoehne
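For anyone wondering how that comparison is usually made, here is a standard back-of-the-envelope calculation (not taken from the comments here, and assuming the LHC’s design centre-of-mass energy of about 14 TeV): it asks how energetic a cosmic-ray proton hitting a stationary proton must be to reach the same centre-of-mass energy.

    cm_energy_lhc_gev = 14000.0        # assumed LHC design centre-of-mass energy (~14 TeV)
    proton_rest_energy_gev = 0.938272  # proton rest energy

    # For a fixed-target collision, the centre-of-mass energy is roughly
    # sqrt(2 * E_cosmic * m_p c^2); invert it to find the matching cosmic-ray energy.
    e_cosmic_gev = cm_energy_lhc_gev**2 / (2 * proton_rest_energy_gev)
    print(f"{e_cosmic_gev * 1e9:.1e} eV")   # ~1e17 eV

That works out to roughly 10^17 eV per cosmic-ray proton, whereas cosmic rays up to about 10^20 eV have been recorded.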
@Brad Hoehne, OK, but my remark simply meant the difference between
1. cosmic radiation, with high velocity, colliding with particles on Earth which are more or less at rest,
or, in contrast to that,
2. particles in the LHC which move towards each other at almost light speed and collide.
Nowhere in my remark did I claim that velocities add up at relativistic speeds. I never wrote that and I never assumed it. It was something you read into it.
My comment from 10:07 has the link to the LHC Safety Assessment Group’s statement, where they refer to exactly that difference, just as I did. I guess you won’t think that they made the particular mistake you mention? 🙂 Neither did I.
$9 billion.
But a waste of money anyway.
Anon 12:33:
Excuse me for misunderstanding you, but, in my view, you seemed to be saying that the LHC would be a different kettle of fish because BOTH particles were moving. But this assumes that there is such a thing as “moving” and “at rest” without picking a frame of reference. In fact, there is no such thing as “at rest”. We here on planet Earth are moving around the sun, around the galaxy, across the universe relative to some galaxy over on the other side of the universe. I, relative to my desk, am “at rest” (usually right after lunch in a food coma) but, relative to a galaxy on the other side of the universe, I’m moving at nearly the speed of light. (Strange, it doesn’t feel like it.)
As an analogy, consider that car makers don’t need to crash-test their vehicles by smashing two cars together, each going 25 mph. They can smash one car at 50 mph into an identical stationary car (or at 25 mph into a rigid crash plate) and know the effect is exactly the same; only the relative motion matters.
There is no functional difference between a cosmic ray hitting Earth’s “at rest” atmosphere at, for instance, 0.999c and two particles, each moving at less than 0.999c, colliding with one another at a relative velocity of 0.999c. Since velocities don’t add up according to our common-sense notion at relativistic speeds, you’re not dealing with higher velocities in the LHC than with cosmic rays.
The LHC will likely move particles at about 99.999999% of the speed of light. Cosmic rays can accelerate particles to 99.9999999999% of the speed of light or more.
Brad Hoehne
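As a rough check on those percentages, assuming the LHC’s design beam energy of 7 TeV per proton and a proton rest energy of about 0.938 GeV (figures not taken from the comments above):

    import math

    beam_energy_gev = 7000.0           # assumed design energy per proton, 7 TeV
    proton_rest_energy_gev = 0.938272

    gamma = beam_energy_gev / proton_rest_energy_gev   # Lorentz factor, ~7460
    beta = math.sqrt(1.0 - 1.0 / gamma**2)             # speed as a fraction of c
    print(f"gamma ~ {gamma:.0f}, v/c ~ {beta:.9f}")    # v/c ~ 0.999999991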
Ahh, sorry anon 12:33.
I see what you mean. To paraphrase:
“Since the LHC collisions produce particles that are slow with respect to US- in contrast to cosmic rays, which are moving at high speed with respect to US- whatever microscopic beasties are produced will have more time to do their dirty work.”
If that’s what you mean, I see your point. However, given that the velocities involved will still be quite high with the LHC, I don’t see how this could make much of a functional difference.
Brad H.
In the end….this is just a fancy popcorn making machine.
Yes, that was my point, and I think it is also one of the points of critics like Prof. Rössler. The products of particle collisions caused by cosmic radiation will almost always retain high velocities themselves, and therefore leave Earth quickly. In contrast, I am afraid that the products of the collisions (thousands of trillions per year!) in the LHC will not “move out” quickly.
So, I think that arguments about the safety of the experiment need to take into account that the products of the collisions do not have high velocities, which would otherwise make them leave Earth quickly before they can cause harm (except if they are charged, if I understand all this correctly).
As I read all the notes posted in this link….and all the links that some of you have attached, I have come to the conclusion that: man, I must be really dumb cause I don’t understand a single thing. Maybe someone will come up with the ‘Particle Collisions for Dummies’ so this whole thing can make more sense and I can contribute to the topic. In the meantime……I’ll be in the WCL site waiting to crush you all at chess 😉
The best explanation of this issue I’ve seen—balancing authoritative physics and general readability—is this April 3 item at the “BackRe(Action)” blog run by a husband-and-wife physicist couple. And their followup here.
I actually had my own private grappling with this issue last year, when I wondered whether commonly-voiced assertions about so-called “universal prior” distributions might hold in “conditional” form. An example (not the only one) of what I’m talking about is Juergen Schmidhuber’s “Speed Prior”—if you go to his funky webpages and talk slides you can find some concrete predictions about the pseudo-randomness (which means dropoff in information content) of quantum mechanics. The particular conditional assertion that concerns me is that if X and Y are physical events subject to a probability distribution of universal-prior character, and the conditional information of Y given X is far less than the information content of X or Y separately, then X and Y “should be” correlated. Then if X is something you do umpzillion times in a lab, the correlation is positive, and Y is a catastrophe, you’ve increased the risk of Y even without what you ordinarily think of as a “physical” connection—just a “digital” one. This belongs to realms of digital physics that generally aren’t yet considered testable, but I had (and still have) a new idea that among many things could potentially test Schmidhuber’s predictions. However, this survey paper by Ulvi Yurtsever convinced me that my tests could not succeed without violating special relativity (indeed, this connection comes up often in quantum computation theory). Since such a test was a pre-requisite for any kind of getting a handle on what “X” and “Y” could be, I was able to drop the matter in good conscience last winter.
Incidentally, my idea derives originally from considering hash-table collisions in chess engines. And this prominent digital-physics website (see also Wikipedia here) is run by Professor Edward Fredkin of the famous Fredkin Prizes in computer chess.
Finally, in my opinion the scientist whose cautions on the human side should be looked at first is Dr. Adrian Kent of Cambridge University, who hasn’t been getting the press of Rössler or Wagner. Suffice it to say that I’ve actually had to look beneath the hood of this issue, and I decided that due diligence has been done by the assessors at the LHC (and also, earlier, at RHIC on Long Island).
What a bunch of nerds!
Sigh, oh well, it is a chess blog after all…
“What a bunch of nerds!”
It takes one to know one.
I can say a little bit more about the connections and particulars.
First, as with the April 3 summary I cited, there’s a chain of hypotheticals before you get to the danger zone. In my case the first hurdle is, “Does any observable in Nature obey a universal-prior distribution?” The only candidates I’ve seen proposed (by Schmidhuber and others) that are rigorously describable seem all to be rendered untestable by Yurtsever’s argument.
Second, a key point here is the difference between asymptotic time complexity (ATC) and concrete time complexity (CTC). Essentially all of my field’s theory is based on ATC. A superbly optimized chess engine is an example of a program with theoretically high ATC but low enough CTC to play great chess in quick enough time. That’s why it could overcome my field’s barriers to detecting pseudorandomness with low ATC—and for certain non-cosmic applications this element is still wide-open! Yurtsever’s barrier (well, Einstein’s barrier:-) is all about CTC, however.
Third, the next hurdle after succeeding with the first looks paradoxical but struck me as doable: You’d have to show a performance difference between “pseudorandom” and “truly random” bits. But if quantum mechanical beta-decay bits B were pseudorandom, how could you ever get “truly random” bits at all? The detail is that Schmidhuber’s hypothesis also makes the degree of information content depend on the time and space scale. By taking the bitwise exclusive-OR B’ = B (+) A, where A is a series of bits sampled over long time and space intervals (a fanciful example is A = the vector of wins/losses by the home team in 100+ years of major league games—it doesn’t matter that A is biased toward home teams since you’re XOR-ing it with B), the target would be to show that B’ is “more random than” B. (But to be sure, Yurtsever’s argument says that B must already be truly-random, thus contradicting Schmidhuber’s prediction.)
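Purely to illustrate the mechanics of that XOR construction, here is a toy sketch; using a general-purpose compressor as a crude stand-in for “more random than,” and Python’s own pseudorandom generator as a stand-in for both bit sources, are my assumptions, not part of the proposal above:

    import random
    import zlib

    def bits_to_bytes(bits):
        # Pack a list of 0/1 bits into bytes so the compressor can work on them.
        return bytes(
            sum(bit << i for i, bit in enumerate(bits[j:j + 8]))
            for j in range(0, len(bits), 8)
        )

    n = 80_000
    B = [random.getrandbits(1) for _ in range(n)]   # stand-in for the beta-decay bits
    A = [random.getrandbits(1) for _ in range(n)]   # stand-in for the long-timescale series
    B_prime = [b ^ a for b, a in zip(B, A)]         # B' = B (+) A

    # Crude randomness proxy: compressed size (larger = "looks more random" to the compressor).
    for name, seq in [("B", B), ("B'", B_prime)]:
        print(name, len(zlib.compress(bits_to_bytes(seq), 9)))

With ordinary pseudorandom stand-ins the two compressed sizes come out essentially equal; the hypothesis being tested is that genuine beta-decay bits would show a measurable difference.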
Clearing those two hurdles would still only bring one to the as-yet-completely-uncharted territory of digital physics—but like I said, some other pretty smart people are trying to chart it.
The final point is that my hypothetical mechanism was (is?) orthogonal to collision-energy considerations. The fact that cosmic rays regularly strike our atmosphere with more energy than the LHC will muster, and haven’t caused a catastrophe, becomes a primary risk-scaling factor in physical models. This goes even more for intergalactic gamma-ray bursts—the vast one recorded last March 19 has evidently not caused its now-dimmer-glowing region to undergo any terrible phase transition.
That’s why I first felt a special responsibility to pursue this, because it was saying something new. (I did post queries in major physics blogs about how much people have considered universal-prior distributions for issues such as in this amazing 1/15/08 NY Times article—as I quipped to Scott Aaronson at a meeting last June, maybe they’re “good only as priors for universes”.) Understanding the connection described by Yurtsever removed that responsibility. Since I haven’t had any followups, it seems this post is not violating Wittgenstein’s parting admonition that whereof one cannot speak, one must remain silent.