Showing posts with label particle physics. Show all posts

Monday, October 24, 2011

How many neutrinos should exist?

The SM assumes three types of neutrinos. However, these authors analyzed data that point to the need for two more types [http://www.nature.com/nature/journal/v478/n7369/full/478328a.html?WT.ec_id=NATURE-20111020].

Neutrino oscillations, observed through the transmutation of neutrinos of one type into neutrinos of another type, occur if there is mixing between neutrino types and if individual neutrino types consist of a linear combination of different neutrino masses. (At present, the masses and mixings of the fundamental quarks and leptons can be measured but are not fully understood.) In the case of two-neutrino mixing — for example, mixing between the muon neutrino and the electron neutrino — the probability (P) that a muon neutrino (νμ) will oscillate into an electron neutrino (νe) is given by P(νμ → νe) = sin²(2θ) sin²(1.27Δm²L/E). Here, θ, in radians, describes the mixing between the muon neutrino and electron neutrino; Δm² is the difference of the squares of the masses of the two neutrinos in square electronvolts; L is the distance travelled by the muon neutrino in kilometres; and E is the muon-neutrino energy in gigaelectronvolts.
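As a quick numerical illustration of the two-flavour formula quoted above, here is a minimal Python sketch; the function name and the example numbers (an atmospheric-scale Δm², a 735 km baseline, 3 GeV neutrinos) are my own, not from the article:

```python
import math

def p_mu_to_e(theta, dm2_ev2, L_km, E_gev):
    """Two-flavour oscillation probability:
    P = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E),
    with dm2 in eV^2, L in km, E in GeV."""
    return math.sin(2 * theta) ** 2 * math.sin(1.27 * dm2_ev2 * L_km / E_gev) ** 2

# Illustrative numbers only (not from the article)
print(p_mu_to_e(theta=0.15, dm2_ev2=2e-3, L_km=735.0, E_gev=3.0))
```

By construction the result always lies between 0 and 1; maximal mixing (θ = π/4) with the oscillation phase at π/2 gives P = 1.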

In general, the number of different neutrino masses equals the number of neutrino types, so that three-neutrino mixing involves three neutrino masses and two independent Δm² values, whereas five-neutrino mixing involves five neutrino masses and four independent Δm² values. Neutrino oscillations have been observed at a Δm² of about 7 × 10⁻⁵ eV² by detectors that measure the flow of neutrinos from the Sun and experiments that detect neutrinos at a long distance from nuclear reactors. The oscillations have been detected at a Δm² of around 2 × 10⁻³ eV² by detectors that measure the flow of neutrinos from the atmosphere and by experiments in which neutrinos are measured at a long distance from particle accelerators. In addition to these confirmed observations of neutrino oscillations, there is also evidence for oscillations at a Δm² of about 1 eV² from short-distance accelerator and reactor neutrino experiments [2–4]. However, it is not possible to explain this third Δm² value with only three neutrino masses. Therefore, additional neutrino masses are required.
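The counting above can be made concrete: with only three masses, the three pairwise splittings obey the sum rule Δm²₃₁ = Δm²₂₁ + Δm²₃₂, so a third independent scale near 1 eV² cannot be accommodated. A small sketch, using values rounded to the scales quoted above:

```python
dm2_21 = 7e-5   # eV^2, solar/reactor scale
dm2_32 = 2e-3   # eV^2, atmospheric/accelerator scale

# With only three masses, the third splitting is fixed by the other two:
dm2_31 = dm2_21 + dm2_32
print(dm2_31)   # ~2.07e-3 eV^2 -- nowhere near the ~1 eV^2 short-baseline hint
```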

In their study, Kopp et al. [1] tried fitting the world neutrino-oscillation data to theoretical models involving four different neutrino masses (three active neutrinos plus one sterile neutrino) and then five different neutrino masses (three active plus two sterile neutrinos; Fig. 1). They found that one sterile neutrino was insufficient to explain the world data, but two gave a satisfactory global fit. (Similar fits are discussed elsewhere [5–10].) One other feature of the authors' two-sterile-neutrino fit is that it allows for violation in leptons of the charge–parity (CP) symmetry — according to which particles and antiparticles behave like mirror images of each other — or for a difference between neutrino oscillations and antineutrino oscillations. Such CP violation might help to explain the r-process, in which heavy elements are produced through nuclear reactions involving rapid neutron capture (hence the 'r'), and the production of heavy elements in neutrino bursts from stellar explosions known as supernovae. It might also help to explain why the Universe is dominated by matter and not by an equal amount of matter and antimatter.

Tuesday, October 18, 2011

Neutrinos not that fast!

I like this blog entry by Zz. He highlights a recent article whose data show no sign of superluminal neutrinos!

Monday, September 5, 2011

More analogy in Graphene

More work on the analogy between graphene and high-energy physics. This time, the focus is on running coupling constants [Nature Phys. 7, 701–704 (2011)]:
The best times in physics are those when physicists of different expertise meet around a problem of common interest. And this is now happening in the case of graphene. From the early days of the isolation of single sheets of graphene, the relativistic nature of its charge carriers was clear [1]. These carriers, known as Dirac fermions, are described by equations similar to those that describe the quantum electrodynamic (QED) interactions of relativistic charged particles. A meticulous study performed by Elias and co-workers [2] of the electronic structure of graphene shows that at very low energies, within a few meV of graphene's Dirac point, where its cone-like valence and conduction bands touch, the shape of the conduction and valence bands diverges from a simple linear relation. The result implies that the analogy between graphene and high-energy physics is deeper than first expected. In particular, it implies that the electromagnetic coupling of graphene does renormalize, as occurs in quantum field theory [http://www.nature.com/nphys/journal/v7/n9/full/nphys2066.html?WT.ec_id=NPHYS-201109].

Tuesday, August 30, 2011

Revamp may not work

The Standard Model has been very successful, but it has loopholes, and many people try to revamp it with additional ideas. One such idea is supersymmetry, which posits that a new symmetry might exist between bosons and fermions. To implement this symmetry, new particles have to be inserted into the architecture. Now it is in deep trouble: no such particles have ever been detected at the LHC, the most promising place to find them.

The latest results come from the LHC Beauty (LHCb) experiment, one of the four main detectors situated around the collider ring at the European Organisation for Nuclear Research (Cern) on the Swiss-French border.

According to Dr Tara Shears of Liverpool University, a spokesperson for the LHCb experiment: "It does rather put supersymmetry on the spot."

"There's a certain amount of worry that's creeping into our discussions," said Dr Joseph Lykken of Fermilab.

The experiment looked at the decay of particles called "B-mesons" in hitherto unprecedented detail.

If supersymmetric particles exist, B-mesons ought to decay far more often than if they do not exist.

There also ought to be a greater difference in the way matter and antimatter versions of these particles decay.

The results had been eagerly awaited following hints from earlier results, most notably from the Tevatron particle accelerator in the US, that the decay of B-mesons was influenced by supersymmetric particles.

LHCb's more detailed analysis however has failed to find this effect. [http://www.bbc.co.uk/news/science-environment-14680570]


Thursday, August 25, 2011

What is the standard model?


The standard model, as it is usually called, tries to encompass all possible interactions, including the production and annihilation of the somewhat elusive elementary particles that are supposed to partly make up the universe.

Wednesday, July 20, 2011

New upper limit on the electron dipole moment

This experiment sets a new limit on the possible electron dipole moment, and thus puts stringent constraints on candidate extensions of the Standard Model, such as supersymmetric models, that hope for new physics.
The electron is predicted to be slightly aspheric [1], with a distortion characterized by the electric dipole moment (EDM), de. No experiment has ever detected this deviation. The standard model of particle physics predicts that de is far too small to detect [2], being some eleven orders of magnitude smaller than the current experimental sensitivity. However, many extensions to the standard model naturally predict much larger values of de that should be detectable [3]. This makes the search for the electron EDM a powerful way to search for new physics and constrain the possible extensions. In particular, the popular idea that new supersymmetric particles may exist at masses of a few hundred GeV/c² (where c is the speed of light) is difficult to reconcile with the absence of an electron EDM at the present limit of sensitivity [2,4]. The size of the EDM is also intimately related to the question of why the Universe has so little antimatter. If the reason is that some undiscovered particle interaction [5] breaks the symmetry between matter and antimatter, this should result in a measurable EDM in most models of particle physics [2]. Here we use cold polar molecules to measure the electron EDM at the highest level of precision reported so far, providing a constraint on any possible new interactions. We obtain de = (−2.4 ± 5.7stat ± 1.5syst) × 10⁻²⁸ e·cm, where e is the charge on the electron, which sets a new upper limit of |de| < 10.5 × 10⁻²⁸ e·cm with 90 per cent confidence. This result, consistent with zero, indicates that the electron is spherical at this improved level of precision. Our measurement of atto-electronvolt energy shifts in a molecule probes new physics at the tera-electronvolt energy scale [2].
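As a rough sanity check on the quoted numbers, one can combine the statistical and systematic uncertainties in quadrature and form a crude one-sided Gaussian bound. This is my own back-of-the-envelope sketch, not the paper's limit-setting procedure (which treats the folded distribution more carefully, giving the tighter 10.5 × 10⁻²⁸ e·cm):

```python
import math

de_central = -2.4e-28  # e*cm, reported central value
sigma_stat = 5.7e-28   # statistical uncertainty
sigma_syst = 1.5e-28   # systematic uncertainty

# Naive quadrature combination of the two uncertainties
sigma_tot = math.hypot(sigma_stat, sigma_syst)

# Crude one-sided 90% Gaussian bound, for comparison with the quoted limit
naive_limit = abs(de_central) + 1.645 * sigma_tot
print(sigma_tot, naive_limit)
```

The naive bound lands in the same ballpark (~12 × 10⁻²⁸ e·cm) as the published 10.5 × 10⁻²⁸ e·cm, which is all this sketch is meant to show.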

Sunday, July 3, 2011

Space is smooth at the Planck scale

This finding might be among the most stunning and most fundamentally important of the past few decades. Many dazzling yet gaudy speculations can hardly withstand it.
Space is just that smooth! [http://www.physorg.com/news/2011-06-physics-einstein.html]

Einstein’s General Theory of Relativity describes the properties of gravity and assumes that space is a smooth, continuous fabric. Yet quantum theory suggests that space should be grainy at the smallest scales, like sand on a beach.

One of the great concerns of modern physics is to marry these two concepts into a single theory of quantum gravity.

Now, Integral has placed stringent new limits on the size of these quantum ‘grains’ in space, showing them to be much smaller than some quantum gravity ideas would suggest.

According to calculations, the tiny grains would affect the way that light travels through space. The grains should ‘twist’ the light rays, changing the direction in which they oscillate, a property called polarisation.

High-energy gamma rays should be twisted more than the lower energy ones, and the difference in the polarisation can be used to estimate the size of the grains.

Philippe Laurent of CEA Saclay and his collaborators used data from Integral’s IBIS instrument to search for the difference in polarisation between high- and low-energy gamma rays emitted during one of the most powerful gamma-ray bursts (GRBs) ever seen.

GRBs come from some of the most energetic explosions known in the Universe. Most are thought to occur when very massive stars collapse into neutron stars or black holes during a supernova, leading to a huge pulse of gamma rays lasting just seconds or minutes, but briefly outshining entire galaxies.

GRB 041219A took place on 19 December 2004 and was immediately recognised as being in the top 1% of GRBs for brightness. It was so bright that Integral was able to measure the polarisation of its gamma rays accurately.

Dr Laurent and colleagues searched for differences in the polarisation at different energies, but found none to the accuracy limits of the data.

Some theories suggest that the quantum nature of space should manifest itself at the ‘Planck scale’: the minuscule 10⁻³⁵ of a metre, where a millimetre is 10⁻³ m.

However, Integral’s observations are about 10 000 times more accurate than any previous and show that any quantum graininess must be at a level of 10⁻⁴⁸ m or smaller.

“This is a very important result in fundamental physics and will rule out some string theories and quantum loop gravity theories,” says Dr Laurent.

Integral made a similar observation in 2006, when it detected polarised emission from the Crab Nebula, the remnant of a supernova explosion just 6500 light years from Earth in our own galaxy.

This new observation is much more stringent, however, because GRB 041219A was at a distance estimated to be at least 300 million light years.

In principle, the tiny twisting effect due to the grains should have accumulated over the very large distance into a detectable signal. Because nothing was seen, the grains must be even smaller than previously suspected.

“Fundamental physics is a less obvious application for Integral,” notes Christoph Winkler, ESA’s Integral Project Scientist. “Nevertheless, it has allowed us to take a big step forward in investigating the nature of space itself.”

Now it’s over to the theoreticians, who must re-examine their theories in the light of this new result.

Provided by the European Space Agency

Friday, June 24, 2011

Noteworthy papers from latest issue of Science

1. Disorder-Enhanced Transport in Photonic Quasicrystals, 332:1541(2011);
Quasicrystals are aperiodic structures with rotational symmetries forbidden to conventional periodic crystals; examples of quasicrystals can be found in aluminum alloys, polymers, and even ancient Islamic art. Here, we present direct experimental observation of disorder-enhanced wave transport in quasicrystals, which contrasts directly with the characteristic suppression of transport by disorder. Our experiments are carried out in photonic quasicrystals, where we find that increasing disorder leads to enhanced expansion of the beam propagating through the medium. By further increasing the disorder, we observe that the beam progresses through a regime of diffusive-like transport until it finally transitions to Anderson localization and the suppression of transport. We study this fundamental phenomenon and elucidate its origins by relating it to the basic properties of quasicrystalline media in the presence of disorder.

2. Carbon-Based Supercapacitors Produced by Activation of Graphene, 332:1537(2011)
Supercapacitors, also called ultracapacitors or electrochemical capacitors, store electrical charge on high-surface-area conducting materials. Their widespread use is limited by their low energy storage density and relatively high effective series resistance. Using chemical activation of exfoliated graphite oxide, we synthesized a porous carbon with a Brunauer-Emmett-Teller surface area of up to 3100 square meters per gram, a high electrical conductivity, and a low oxygen and hydrogen content. This sp²-bonded carbon has a continuous three-dimensional network of highly curved, atom-thick walls that form primarily 0.6- to 5-nanometer-width pores. Two-electrode supercapacitor cells constructed with this carbon yielded high values of gravimetric capacitance and energy density with organic and ionic liquid electrolytes. The processes used to make this carbon are readily scalable to industrial levels.

3. The Limits of Ordinary Matter, 332:1513(2011)
All ordinary matter consists of protons and neutrons, collectively called nucleons, which are bound together in atomic nuclei, and electrons. The elementary constituents of protons and neutrons, the quarks, almost always remain confined inside nucleons (or any other particle made up of quarks, called hadrons). The fundamental force that binds quarks together—the strong, or “color” force—cannot be overcome unless extremely high-energy conditions are created, such as through heavy-particle collisions. Theoretical simulations based on quantum chromodynamics (QCD) predict that the transition temperature for the appearance of free quarks should occur at 2.0 × 10¹² K (an energy of 175 million eV) (1, 2). Since 2000, the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory has created the necessary conditions to form quark matter in particle collisions, but determining the transition temperature under these conditions is challenging. On page 1525 of this issue, Gupta et al. (3) show that the relevant temperature and energy scales can be extracted from recent experimental studies and find that the transition temperature is in remarkable agreement with theory.
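The quoted correspondence between 175 MeV and 2.0 × 10¹² K is just the conversion T = E/k_B, which a couple of lines verify:

```python
K_B = 1.380649e-23    # Boltzmann constant, J/K
EV = 1.602176634e-19  # one electronvolt, J

E_mev = 175.0               # quoted QCD transition energy scale, MeV
T = E_mev * 1e6 * EV / K_B  # convert MeV -> J, then divide by k_B
print(T)                    # ~2.0e12 K
```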

4. This paper is not published in Science, but is highlighted in it: Nano Lett. 11, 10.1021/nl200928k (2011).
It has long been known from ex situ studies that metal nanoparticles can catalyze reaction of oxygen with graphite surfaces and create grooves or channels. Such reactions could be used for patterning graphene sheets. Booth et al. have studied the dynamics of silver nanoparticles on suspended monolayer and bilayer graphene sheets in a transmission electron microscope. They imaged these samples at temperatures from 600 to 850 K and partial pressures of oxygen over the sample from about 30 to 100 millitorr. The nanoparticles cut channels along <100> crystallographic directions, but some fluctuations of motion normal to the channel direction were also observed. The nanoparticles did not move at a constant speed. Instead, their velocity profile was erratic, and the start-stop motion was better described by a Poisson distribution.

Monday, May 9, 2011

Electron induced rippling in graphene

Physicists are really blessed by nature in the sense that they are constantly being offered new objects that admit very rich phenomena to explore. The latest examples include graphene and topological insulators. Since its discovery, graphene has never stopped yielding surprises for physicists. This time comes something that (again through Dirac physics) parallels particle physics: the strain field associated with the flexural phonon condenses in the same way as the Higgs field in the Standard Model [1]. Don't miss reading it!

[1] Phys. Rev. Lett. 106, 045502 (2011)

Tuesday, April 19, 2011

A big jump

I heard of this jump weeks ago, but I did not bet on it; I feel it may not hold up, though it might. What will the result be? Nobody knows for the moment. There is indeed a stir in the community, as described in this news article [http://www.sciencemag.org/content/332/6027/296.full]:

Particle physicists haven't discovered anything truly surprising in 35 years, so a mere hint of something odd works them up in a hurry. So it was last week, when, aided by press reports, news spread that scientists in the United States may have spotted a bit of matter unlike any seen before. But even as they contemplate the implications, physicists are taking the result with a grain of salt. The supposed signal could be an experimental artifact, caution the researchers who found it. And if a new particle is there, physicists may have to perform theoretical contortions to explain why they didn't spot it before. “I think the result is rather inconclusive,” says Christopher Hill, an experimenter at Ohio State University in Columbus, who was not involved in the work.

The finding comes from the 700-member team working with the CDF particle detector at Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois. The team analyzed the billions of collisions of protons and antiprotons produced by Fermilab's atom smasher, the 25-year-old Tevatron, which will shut down this year. Those high-energy collisions can blast into fleeting existence massive subatomic particles not seen in the everyday world. Physicists try to identify those particles by studying the combinations of familiar particles into which they decay.

In this case, experimenters searched for collisions that produced a particle called a W boson, which weighs about 86 times as much as a proton, along with some other particle that disintegrates into two sprays of particles called “jets.” A jet arises when a collision or decay kicks out a particle called a quark. A quark cannot exist on its own but must be bound to other quarks or an antiquark. So the energetic quark quickly rips more quarks and antiquarks out of the vacuum of empty space, and they instantaneously form particles called mesons, each containing a quark and an antiquark. From the energies and momenta of the two jets, researchers can infer the mass of the particle that produced them.
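The inference described above is the standard invariant-mass reconstruction, m² = (ΣE)² − |Σp|² in natural units. A minimal sketch; the four-vector layout and the example jets are my own, not CDF's:

```python
import math

def invariant_mass(jet1, jet2):
    """Invariant mass of a two-jet system from (E, px, py, pz)
    four-vectors in GeV, natural units (c = 1):
    m^2 = (E1 + E2)^2 - |p1 + p2|^2."""
    E = jet1[0] + jet2[0]
    px = jet1[1] + jet2[1]
    py = jet1[2] + jet2[2]
    pz = jet1[3] + jet2[3]
    return math.sqrt(max(E ** 2 - px ** 2 - py ** 2 - pz ** 2, 0.0))

# Two massless back-to-back 80 GeV jets reconstruct to a 160 GeV parent
m = invariant_mass((80.0, 80.0, 0.0, 0.0), (80.0, -80.0, 0.0, 0.0))
print(m)  # 160.0
```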

CDF researchers see about 250 events in which the jets seem to come from a particle weighing about 155 times as much as a proton. Those events show up as an unexpected peak in a data plot (see diagram). The chances that random jets or jet pairs from other sources would produce a fake signal that strong are 1 in 1300, the physicists estimate. “We've been struggling for 6 months to make this peak go away, and we haven't been able to do it,” says Robert Roser, a physicist at Fermilab and co-spokesperson for the CDF team. Still, he says, the signal is “not even close” to strong enough to claim a discovery.
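A 1-in-1300 background fluctuation probability corresponds to roughly 3.2 standard deviations on the usual one-sided Gaussian convention, which is why the signal is "not even close" to the 5σ discovery threshold. A quick conversion:

```python
from statistics import NormalDist

p = 1 / 1300  # estimated chance of a background fluctuation this strong
z = NormalDist().inv_cdf(1 - p)  # one-sided Gaussian significance
print(z)  # ~3.2 sigma, far below the conventional 5-sigma discovery bar
```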

Experimenters have several reasons to be cautious. The analysis depends critically on physicists' understanding of jets. CDF does not measure every particle in a jet, so researchers must make a 25% upward correction to a jet's measured energy. If the uncertainty in that fudge factor is bigger than they estimate, “then maybe the excess isn't so significant,” says Shahram Rahatlou of Sapienza University of Rome.

CDF physicists must also take care that they haven't mistaken random pairs of jets for new particles. To see the peak, they must subtract out a huge “background” produced by events containing a W and random jets. If that subtraction isn't just right, it could produce a fake signal. “The real question is how well do we understand that [background],” says Joseph Lykken, a theorist at Fermilab.

But those caveats have not stopped theorists from trying to explain the curious bump in terms of new particles. Felix Yu, a theorist at the University of California, Irvine, suggests that the new particle could be one known as a Z′ (pronounced Z-prime), which would convey a new force much like a very short-range electromagnetic force. Estia Eichten, a theorist at Fermilab, and colleagues say the particle could be a “technipion,” a particle predicted by a type of theory called “technicolor,” which posits a new kind of strong nuclear force.

To have escaped notice until now, however, a particle would have to have some weird properties. Generally, a Z′ ought to decay into an electron and an antielectron. In fact, experimenters have already searched for and failed to find that decay. So Yu's Z′ must not decay that way for some reason. The technipion may face similar problems. CDF researchers are searching for the long-sought Higgs boson, the key to physicists' understanding of mass, by looking for events in which it is produced with a W boson and in which the Higgs decays into two jets specifically triggered by particles called bottom quarks. The hypothetical new particle hasn't shown up in those Higgs searches, so it must not often decay into bottom quarks, as one would expect a technipion to do.

For those reasons, some physicists say such explanations of the bump seem contrived or “unnatural.” “Yesterday, these models weren't popular,” Hill says. Yu counters that “having a theory that looks pretty but doesn't fit the data isn't natural.”

The supposed signal should be confirmed or ruled out in short order. The CDF team has analyzed only half of the data it has already collected. And the Tevatron's other large particle detector, D0, has a data set as big as CDF's. If the particle is there, D0 should see it, too. “We hope that within a few weeks you'll be hearing from us,” says Dmitri Denisov, a physicist at Fermilab and co-spokesperson for the D0 team. In the meantime, physicists will enjoy the buzz.

Tuesday, March 8, 2011

Doubts cast on Supersymmetry

Supersymmetry is regarded as a beautiful model by particle physicists. However, "beauty" does not guarantee truth. Recent results from the LHC put it in trouble [http://www.nature.com/news/2011/110302/full/471013a.html?WT.ec_id=NATURE-20110303].

"Wonderful, beautiful and unique" is how Gordon Kane describes supersymmetry theory. Kane, a theoretical physicist at the University of Michigan in Ann Arbor, has spent about 30 years working on supersymmetry, a theory that he and many others believe solves a host of problems with our understanding of the subatomic world.

Yet there is growing anxiety that the theory, however elegant it might be, is wrong. Data from the Large Hadron Collider (LHC), a 27-kilometre proton smasher that straddles the French–Swiss border near Geneva, Switzerland, have shown no sign of the 'super particles' that the theory predicts [1–3]. "We're painting supersymmetry into a corner," says Chris Lester, a particle physicist at the University of Cambridge, UK, who works with the LHC's ATLAS detector. Along with the LHC's Compact Muon Solenoid experiment, ATLAS has spent the past year hunting for super particles, and is now set to gather more data when the LHC begins a high-power run in the next few weeks. If the detectors fail to find any super particles by the end of the year, the theory could be in serious trouble.

Monday, March 7, 2011

Dark Matter Particles Remain Dark

Here is a controversy about dark matter particles, whose properties remain quite elusive. Most of what we know about them is speculative. Bear in mind the scrutiny a scientific conclusion has to go through!

Thursday, January 20, 2011

Reality Check at the LHC

A synopsis of what was achieved in the first run of the LHC:
http://physicsworld.com/cws/article/indepth/44805

Monday, December 13, 2010

The proton size

In a previous entry, I posted a report on a muon-based measurement of the proton size, which gave a smaller size than had been accepted. Now a new measurement has come out that disagrees with the smaller value. This new one is based on electrons. [http://physics.aps.org/synopsis-for/10.1103/PhysRevLett.105.242001]

The charge radius of the proton is one of nature’s fundamental parameters. Its currently accepted CODATA (Committee on Data for Science and Technology) value, 0.8768 × 10⁻¹⁵ m, has been determined primarily by measurements of the hydrogen Lamb shift and, to lesser accuracy, by electron-proton scattering experiments. This value has recently been called into question by a research team at the Paul Scherrer Institut (PSI) in Villigen, Switzerland. By measuring the Lamb shift in muonic hydrogen, these researchers obtained a value of 0.8418 × 10⁻¹⁵ m for the charge radius, five standard deviations below the CODATA value.
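The "five standard deviations" statement can be inverted to see what combined uncertainty it implies; this is a back-of-the-envelope reading of the quoted numbers, not the actual error budget:

```python
r_codata = 0.8768e-15  # m, electron-based CODATA value
r_muonic = 0.8418e-15  # m, muonic-hydrogen value

delta = r_codata - r_muonic  # discrepancy, ~0.035 fm
sigma_implied = delta / 5    # the quoted "five standard deviations"
print(delta, sigma_implied)  # ~3.5e-17 m, ~7e-18 m (0.007 fm)
```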

In a paper appearing in Physical Review Letters, the A1 Collaboration has determined the electric and magnetic form factors of the proton with higher statistics and precision than previously known, using the Mainz (Germany) electron accelerator MAMI (Mainz Microtron) to measure the electron-proton elastic-scattering cross section. Both form factors show structure at Q² ≈ m_π² that may indicate the influence of the proton’s pion cloud. But, in addition, the collaboration’s extracted value for the charge radius agrees completely with the CODATA value. The discrepancy between “electron-based” measurements and the recent PSI “muon-based” measurement thus remains a puzzle. – Jerome Malenfant

Tuesday, November 9, 2010

How charge is renormalized by gravity

Electric charge (the bare one), g, measures the coupling strength between electrons and photons. In QED, g is a constant. However, if interactions of the QED fields with other fields (particles) are taken into account, g gets renormalized in the sense of renormalization-group theory. In this article [doi:10.1038/nature09506], the author looks at how the gravitational field renormalizes g. In his treatment, he assumes a cutoff below which Einstein's theory is a reasonable starting point for quantization. Going through the usual RG procedures, he arrives at the statement that gravity results in QED asymptotic freedom at high energy scales: g tends to zero at very large energy.
The first term on the right-hand side of equation (12) is that present in the absence of gravity (found by letting κ → 0) and results in the electric charge increasing with energy. The second term is the correction due to quantum gravity. For pure gravity with Λ = 0, or for a small value of Λ as suggested by present observational evidence [40], the quantum gravity contribution to the renormalization group β-function is negative and therefore tends to result in asymptotic freedom, in agreement with the original calculation [13].

Sunday, November 7, 2010

Still Quiet is Dark Matter

Cosmological observations suggest the existence of dark matter, which has not shown any trace of interacting with known baryonic matter. Yet dark matter comprises over 80% of the total matter needed to explain the space-time structure. Scientists have no clue about the nature of this matter. One proposal says it may be made of a sort of particle, the so-called WIMPs (weakly interacting massive particles). Various experiments have been devised to detect them. No positive results exist up to now. The latest effort appeared in PRL, and still no activity of these particles was detected. They are really quiet, should they be there. [Phys. Rev. Lett. 105, 131302 (2010)]
The XENON100 experiment, in operation at the Laboratori Nazionali del Gran Sasso in Italy, is designed to search for dark matter weakly interacting massive particles (WIMPs) scattering off 62 kg of liquid xenon in an ultralow background dual-phase time projection chamber. In this Letter, we present first dark matter results from the analysis of 11.17 live days of nonblind data, acquired in October and November 2009. In the selected fiducial target of 40 kg, and within the predefined signal region, we observe no events and hence exclude spin-independent WIMP-nucleon elastic scattering cross sections above 3.4 × 10⁻⁴⁴ cm² for 55 GeV/c² WIMPs at 90% confidence level. Below 20 GeV/c², this result constrains the interpretation of the CoGeNT and DAMA signals as being due to spin-independent, elastic, light mass WIMP interactions.
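The "we observe no events" statement turns into a cross-section limit through the standard zero-event Poisson bound: at 90% CL the upper limit on the mean signal count μ solves e^(−μ) = 0.10. A sketch; converting μ to cm² needs the experiment's efficiency and exposure, which I do not reproduce here:

```python
import math

# Zero observed events, negligible expected background: the 90% CL
# Poisson upper limit on the mean signal count mu solves exp(-mu) = 0.10
mu_90 = -math.log(0.10)
print(mu_90)  # ~2.30 expected signal events

# The cross-section limit then scales as mu_90 / (efficiency * exposure);
# both factors are experiment-specific and not shown here.
```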

Wednesday, October 27, 2010

Cloud Chamber

A cloud chamber (CC) is used to detect tiny radiation particles. It was invented by Wilson, and the invention won him a Nobel Prize. The operating principle is simple: soak a chamber with alcohol, seal it, and cool it down. The alcohol vapour becomes supersaturated, entering a metastable state ready to condense, and condensation can be triggered by a small perturbation (nucleation centres). When a particle passes through the chamber, liquid droplets form along its path, tracking it and making it detectable.

Here is a video demonstrating how to make a simple cloud chamber:
http://education.jlab.org/frost/cloud_chamber.html

Note: a metastable state is one that is stable against small perturbations but can be tipped into a more stable state by a sufficiently large one.

Saturday, October 23, 2010

Curved space generating mass?

Since Newtonian times, physicists have had to talk of mass, a quantity whose origin is a deep mystery. In Newtonian mechanics, mass impedes the change of velocity. In relativity, mass (in the conventional sense) is no more than the rest energy. In non-relativistic quantum mechanics, mass makes an entity behave more like a particle. In relativistic quantum mechanics, mass sets the energy required to generate an electron-positron pair. In condensed matter physics, mass is the minimum energy needed to excite a system. Besides, mass also sets a correlation length.

Although we know many things about mass, we don't have a clear clue where it comes from. In the Standard Model, all masses are produced by the Higgs mechanism: every massless particle moves in a kind of ether, the Higgs field, and thereby acquires mass. Another idea is that mass can be generated by curved space, or more accurately, by a compactified dimension. Compactifying a dimension yields finite, confined motion. According to quantum mechanics, finite motion implies discrete levels and finite gaps, and hence a mass. Yet a clear physical realization has been missing.
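The gap-from-compactification argument can be put in numbers with the particle-on-a-circle estimate E ~ ħc/R. This is my own order-of-magnitude sketch: order-one prefactors are omitted, and for rolled graphene the relevant speed would be the Fermi velocity rather than c:

```python
HBAR_C_MEV_FM = 197.327  # hbar*c in MeV*fm

def mass_gap_mev(radius_fm):
    """Energy gap ~ hbar*c / R for motion confined to a compactified
    dimension of radius R, order-one prefactors omitted."""
    return HBAR_C_MEV_FM / radius_fm

# A 1 fm radius gives a gap around the QCD scale (~200 MeV);
# a nanometre-scale tube (1e6 fm) gives a gap in the ~200 eV range.
print(mass_gap_mev(1.0), mass_gap_mev(1e6))
```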

Graphene provides a playground for studying this mechanism. These authors roll up graphene and obtain a massive 1D system from a 2D massless Dirac system [http://arxiv.org/ftp/arxiv/papers/1010/1010.3437.pdf]. This is not surprising and was actually known before, but it is an example showing how mass might be generated this way. Back in elementary particle physics, however, where is the hidden dimension beyond the 4D space-time we are all used to? Another question is how the masses so obtained interact gravitationally. In any case, mass should be gravitationally active!

Thursday, September 23, 2010

NO dark matter detected, yet

Without a wisp of exaggeration, the greatest mystery in present physics might be the so-called dark matter and dark energy. Physicists, fairly speaking, do not yet have even the slightest definite clue about them. They were motivated by two observations: (1) the rotation velocity of a typical galaxy does not follow the pattern predicted by Newton's theory; (2) the universe is expanding faster and faster. Fact (1) led to the proposal of dark matter, and (2) to that of dark energy. Interestingly, the dark energy term was first hypothesized by Einstein, who wanted a static and stable universe and later dismissed the term after Hubble's discovery. This energy never dilutes in the course of expansion; it permeates everywhere. People don't know where it comes from, although some have suggested it might be vacuum energy (calculations reject this idea). As for dark matter, it is usually hypothesized to consist of some undetected particles other than baryons, which interact extremely weakly with visible matter. Some suggest these might be the so-called weakly interacting massive particles (WIMPs) predicted by supersymmetric theory. Detectors have been mounted to settle the issue. No positive results exist up to now. The latest survey reports a failure [Phys. Rev. Lett. 105, 131302 (2010)].

Although the dark matter idea is popular, it is quite dubious to some physicists, who don't like extra assumptions. In 2004, a German group did a study revealing a running gravitational constant that grows at astronomical scales [Physical Review D 70: 124028 (2004)]. Such a study might obviate the need for dark matter.

Tuesday, September 21, 2010

An idea about why gravitational force is weak

The gravitational force is rather weak under usual conditions. This fact is simply accepted and not much questioned. Today, in a colloquium with a very small circle of friends, we came up with a rather eccentric idea. Here I just want to make a record of it. It is drastically different from the conventionally held view, and being very new, it is in its infancy. Still, it is attractive for these reasons:

(1) Let's assume that all fundamental fermions are spin-1/2 and carry charge, but that no primary mass is imposed on them.

(2) The particles don't interact with each other directly, but rather via bosons. The charges couple directly to photons, as indicated by the usual Feynman diagrams, in which an incoming charge and an outgoing charge meet at a photon vertex. By conservation of spin, the photon must have spin 1.

(3) As widely held, the quantum of the gravitational field, which is represented by a metric tensor, should have spin 2. Now if two spin-1/2 charges interacted with a graviton in a way similar to the way they interact with a photon, spin conservation would be violated. Therefore, it is worth considering a simple way out: let the charges couple directly to photons, while the photons couple to the graviton. In this case, spin can be conserved.

(4) What interesting consequences may be inferred? First, the masses of the charges emerge rather than being endowed. Second, the gravitational force becomes a higher-order effect, which explains its weakness. Third, a connection between mass and electric charge may be established.

The idea thus described may sound ridiculous, but it is worth thinking about more.