LHC May Have Found Crack in Modern Physics

In late 2008, a few onlookers believed that the Large Hadron Collider (LHC) would bring the end of the world. Three years later, our planet remains intact, but the European particle smasher may have made its first crack in modern physics. If this crack turns out to be real, it might help explain an enduring mystery of the universe: why there’s lots of normal matter, but hardly any of the opposite—antimatter. “If it holds up, it’s exciting,” says particle physicist Robert Roser of the Fermi National Accelerator Laboratory in Batavia, Illinois.

To understand why physicists are excited, look around: We’re surrounded by stuff. That might seem obvious, but scientists have long wondered why there’s anything at all. Accepted theories suggest that the big bang should have produced equal amounts of matter and antimatter, which would have soon annihilated each other. Clearly, the balance tipped in favor of normal matter, allowing the creation of everything we see today—but how, no one’s sure.

Most probably, theorists say, the properties of matter and antimatter aren’t quite symmetrical. Technically, this disparity is known as charge-parity (CP) violation, and it should crop up when particles naturally decay: either normal particles would decay more often than their antiparticles do or vice versa. According to the accepted theory of elementary particles, the standard model, there should be a low level of CP violation but not enough to explain the prevalence of normal matter. So experiments have been trying to find cases in which CP violation is higher.

That’s where LHCb, one of six detectors at the LHC, may have been successful. It has been tracing the paths of particles known as D0 mesons, which, along with their antiparticles, can decay into pairs of either pions or kaons. By tallying these pions and kaons, the LHCb physicists have calculated the relative decay rates between the D0 particles and antiparticles. The result, revealed at a meeting in Paris this week, is startling: the rates differ by 0.8%.
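
As a rough illustration of that tallying, here is a minimal sketch, in Python, of how a raw decay-rate asymmetry and its counting uncertainty could be formed from event counts. The counts are made-up numbers chosen only to mimic the quoted 0.8% difference at roughly three sigma, and the simple asymmetry formula ignores the production and detection corrections a real LHCb analysis applies.

```python
from math import sqrt

# Made-up tallies of reconstructed decays, chosen to mimic a ~0.8% difference at ~3 sigma
n_d0 = 69_100       # D0 candidates decaying to the chosen final state
n_d0bar = 68_000    # anti-D0 candidates in the same final state

# Raw asymmetry between the two decay rates
a_raw = (n_d0 - n_d0bar) / (n_d0 + n_d0bar)

# Poisson counting error propagated through the asymmetry formula
sigma_a = 2 * sqrt(n_d0 * n_d0bar * (n_d0 + n_d0bar)) / (n_d0 + n_d0bar) ** 2

print(f"raw asymmetry = {a_raw:.2%} +/- {sigma_a:.2%}")
print(f"significance  = {a_raw / sigma_a:.1f} sigma")
```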

On the face of it, this level of CP violation is at least eight times as high as the standard model allows, so it could help explain why there is still “stuff” in the universe. But there’s a caveat: the measurement is not yet precise enough. For true discoveries, physicists demand a statistical certainty of at least five sigma, which means less than one chance in 3 million that the result is a random blip in the data. Currently, the LHCb team’s certainty is about three sigma, leaving roughly one chance in several hundred that the result is a fluke.
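
For readers who want to see where such odds come from, the short sketch below converts a significance quoted in sigma into the corresponding tail probability of a Gaussian fluctuation. The one-sided convention is an assumption, and this is only a rule-of-thumb translation, not the collaboration's statistical treatment.

```python
from scipy.stats import norm

for n_sigma in (3.0, 5.0):
    # One-sided probability of a Gaussian fluctuation at least this large
    p = norm.sf(n_sigma)
    print(f"{n_sigma:.0f} sigma -> p = {p:.2e} (about 1 in {1 / p:,.0f})")
```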

Matthew Charles, a physicist at the University of Oxford in the United Kingdom and spokesperson for the 700-strong LHCb collaboration, is naturally cautious. “The next step will be to analyze the remaining data taken in 2011,” he says. “The sample we’ve used so far is only about 60 percent of what we’ve recorded, so the remainder will improve our precision quite a bit and will give us a strong clue as to whether the result will hold up.” For that analysis, the public will have to wait until next year.

Particle physicist Paul Harrison of the University of Warwick in the United Kingdom, who works on other LHCb studies, isn’t getting his hopes up. “I’m not betting my pension on this result standing the test of further data,” he says. He thinks the uncertainty is simply too big. “Since we are measuring hundreds of different things at the LHC, then every so often one of them will give a three-sigma effect like this at random.”

There are reasons to be positive, though. Last year, the CDF collaboration based at Fermilab reported a similar difference between the D0 decay rates of 0.46 percent. At the time, the result was thought likely to be a blip because CDF’s statistical uncertainty was fairly big, but taken together with the LHCb result, it might be seen to carry more weight. And CDF, like LHCb, still has more data to trawl through. “We are now obviously very motivated to extend our analysis to our full data sample and see if we can get an independent confirmation of the LHCb result,” says Giovanni Punzi of the University of Pisa in Italy, a spokesperson for the CDF collaboration.

Pristine relics of the Big Bang spotted

For the first time, astronomers have discovered two distant clouds of gas that seem to be pure relics from the Big Bang. Neither cloud contains any detectable elements forged by stars; instead, each consists only of the light elements that arose in the Big Bang some 14 billion years ago. Furthermore, the relatively high abundance of deuterium seen in one of the clouds agrees with predictions of Big Bang theory. Just after the Big Bang, nuclear reactions created the three lightest elements – hydrogen, helium, and a tiny bit of lithium. Stars then converted some of this material into the heavy elements such as carbon and oxygen that pepper the cosmos today.

But no-one has ever seen a star or gas cloud made solely of these three Big Bang elements. Instead, all known stars and gas clouds harbour at least some "metals", the term astronomers use to describe any element, even carbon and oxygen, that is heavier than helium.

Minutes after the Big Bang

Now, Michele Fumagalli and Xavier Prochaska of the University of California, Santa Cruz, and John O'Meara of Saint Michael's College in Vermont have found two pristine gas clouds. "Their chemical composition is unusual," says Fumagalli. "This gas is of primordial composition, as it was produced during the first few minutes after the Big Bang."

One gas cloud resides in the constellation Leo, the other in Ursa Major. The Leo cloud has a redshift – a measure of its distance – of 3.10, which means it is 11.6 billion light-years from Earth. The Ursa Major cloud is slightly farther away, with a redshift of 3.41 and a distance of 11.9 billion light-years. We therefore see both clouds as they were about two billion years after the Big Bang. The clouds are far too faint to observe directly. Fumagalli and colleagues discovered them only because the clouds happen to lie in front of even more distant quasars, which are luminous galaxies that were much more common long ago. Atoms in the gas clouds absorb some of the light from the background quasars, and the wavelengths at which that absorption appears reveal important information about the composition of the clouds.
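
As a rough cross-check on the quoted distances, the sketch below converts the two redshifts into light-travel (lookback) times with astropy. The flat ΛCDM parameters used (H0 = 70 km/s/Mpc, Ωm = 0.3) are an assumption, not necessarily those adopted by the authors, so the numbers only need to come out close to the figures in the text.

```python
from astropy.cosmology import FlatLambdaCDM

# Assumed cosmology; the published figures may rest on slightly different parameters
cosmo = FlatLambdaCDM(H0=70, Om0=0.3)

for name, z in [("Leo cloud", 3.10), ("Ursa Major cloud", 3.41)]:
    lookback_gyr = cosmo.lookback_time(z).to("Gyr").value   # how long the light has travelled
    age_then_gyr = cosmo.age(z).to("Gyr").value             # age of the universe at emission
    print(f"{name}: z = {z}, lookback ~ {lookback_gyr:.1f} Gyr, "
          f"universe age then ~ {age_then_gyr:.1f} Gyr")
```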

Hydrogen only

Despite using the mammoth Keck I telescope atop Mauna Kea in Hawaii, the astronomers failed to find any element except hydrogen in the two clouds. While the researchers also expect helium and lithium to be present, their technique is not sensitive to those elements. However, if oxygen, carbon or silicon were present, they should have been easy to spot. From this, the researchers deduce that the metal-to-hydrogen ratio (or metallicity) of the Leo cloud is less than 1/6000th that of the Sun, and that of the Ursa Major cloud is less than 1/16,000th of the solar value.

In comparison, ancient stars in the Milky Way's most primitive population – the stellar halo – typically have metallicities around 1/50th that of the Sun. The most metal-poor halo star known has a metallicity 1/22,000th that of the Sun, similar to the upper limits for the two gas clouds. "It's a very interesting discovery," says Nick Gnedin, an astronomer at Fermilab in the US, who is unaffiliated with the discovery team. Gnedin says it has been very difficult to understand why all other gas clouds – even those at greater distances – contain metals. These newfound exceptions, he says, should help astronomers understand how those other clouds acquired their metals.
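
Astronomers usually quote such ratios on a logarithmic "dex" scale relative to the Sun; the snippet below simply re-expresses the numbers mentioned above in that notation. The conversion [M/H] = log10(Z/Z_sun) is standard; nothing else is assumed.

```python
import math

# Metallicity relative to the Sun, re-expressed as [M/H] = log10(Z / Z_sun)
values = {
    "Leo cloud (upper limit)":        1 / 6000,
    "Ursa Major cloud (upper limit)": 1 / 16000,
    "typical halo star":              1 / 50,
    "most metal-poor halo star":      1 / 22000,
}

for name, ratio in values.items():
    print(f"{name:32s} Z/Z_sun = {ratio:.1e}  ->  [M/H] = {math.log10(ratio):+.1f}")
```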

Agrees with Big Bang predictions

In addition to ordinary hydrogen, Fumagalli and colleagues detected the hydrogen isotope deuterium in the Ursa Major cloud. Physicists believe that the Big Bang produced deuterium but that stars then destroyed it – so the universe once had more deuterium than it does today.  The high deuterium-to-hydrogen ratio in the gas matches Big Bang predictions. "The fact that we see deuterium that is comparable to what is expected from theory is giving us more confidence that this gas is actually primordial in its composition," says Fumagalli.

"Nice connection"

Rob Simcoe of the Massachusetts Institute of Technology says that the two clouds show that pockets of the universe remained free of stars and their ejecta for some two billion years after the Big Bang. "This is a nice connection between work that is being done on the early universe using these gas clouds and work that is being done in our backyard, in the stellar halo of the Milky Way, where people have discovered stars that have comparably low chemical abundances," he says.

Each gas cloud has only about a millionth of the Milky Way's mass. Simcoe suspects that each will eventually fall onto a galaxy and form stars. If one of those galaxies now has astronomers, they may be peering at the nascent Milky Way and seeing primitive gas clouds that spawned stars in our own galaxy's stellar halo.

Where does the brain's energy come from?

Roughly half a million years ago, our human brains began to grow, and a bigger brain naturally needs more energy to keep going. What puzzles scientists is this: if the metabolic rate of the human body, which supplies that energy, is no different from that of our cousins with much smaller brains, where did the extra energy come from?

According to one recent hypothesis, our brain's energy needs were met by a smaller gut, because food that is easier to digest frees up the gut's energy for brain growth. New research, however, argues that this hypothesis may not be right at all, because the energy held in the body's fat stores matters more. "Animals with bigger brains have thinner fat deposits, and animals with more fat tissue have smaller brains," says Ana Navarrete of the University of Zurich in Switzerland, who adds: "Either you have a bigger brain or you have richer fat reserves. The two usually stand in inverse proportion to each other."
 
The brain needs roughly 22 times as much energy as a muscle to work at a comparable capacity, and that energy comes from the food we eat. Our brains are about three times the size of those of our closest animal relatives, the chimpanzees, and consume three times as much energy, even though the two species have similar metabolic rates. That extra energy has to come from somewhere, and to find out where, Navarrete and her colleagues examined 191 specimens from 100 different mammal species. The aim was to pit the "expensive tissue" hypothesis (which holds that the energy was freed up by a smaller gut) against a rival idea that attributes it to the richness of the body's fat stores. The result was that a clear relationship emerged not between brain and gut size, but between brain size and the amount of fat carried in the body, and it was most pronounced in wild animals and in females. Moreover, the relationship did not hold at all for the 23 primate species in the sample, perhaps because those specimens all came from captive animals whose fat stores were either larger or smaller than those of their wild relatives.
 
Carrying a large fat store slows an animal down during movements such as climbing, flying, or fleeing from predators, but it is also a good energy reserve. According to Navarrete, this points to two survival strategies: either store more fat, and thereby more energy, for hard times, or have a bigger brain and think your way out of them.
 
Humans, however, have both big brains and plenty of fat, which means that in hard times we can rely on either option for survival. Navarrete argues that something else has been left out of the picture, something related to our unusually efficient way of getting around. Although she did not examine human specimens, she believes that our completely different style of locomotion has broken the old rules of movement. Walking on two legs is a far more energy-efficient way of getting around than clambering through the trees on all fours as chimpanzees do. And because carrying fat therefore does not cost us humans much, we can draw on those reserves and at the same time use our brains to find scarcer sources of food.
 
Even so, the absence of human specimens from the study worries some scientists who were not involved in it, and they say Navarrete's interpretation may be overstated. "These results suggest that in early humans fat stores were not sacrificed at all as the brain grew," says Jack Baker, a researcher at the University of New Mexico who played no part in the study. "The importance of this work rests entirely on how the results relate to the expensive tissue hypothesis, which is almost exclusively about human origins, and yet no human specimens were examined in this study." Leslie Aiello, a researcher at the Wenner-Gren Foundation in New York who was also not involved, says: "Nevertheless, Navarrete and her colleagues have assembled an unprecedented dataset, which is a dramatic improvement on what we had 20 years ago."
 
According to Aiello, the dataset adds more information to the puzzle of how the human brain evolved, but it also raises questions and complications that have no single answer; an answer will have to be drawn from everything we know about gut size, humans' unusually high fat reserves, our distinctive way of moving, and other debatable factors. The study is published in the journal Nature.

Most powerful gamma-ray pulsar discovered

The discovery of an extraordinarily powerful magnetic field within a globular cluster of stars has provided experimental confirmation that intensely powerful gamma-ray pulsars, of a kind not even imagined before, really exist.

According to the researchers, whose paper was published today, Friday, in the journal Science, they used the telescope aboard the Fermi gamma-ray space observatory to identify these millisecond pulsars (MSPs for short), which spin about 43,000 times every minute. The discovery will sharpen our current understanding of the physics governing ultra-dense matter and the magnetic forces scattered across the galaxy. The remarkable pulsar now bears the name PSR J1823-3021A.
 
"نور خوشه‌های کروی، مثل آواز نغمه‌سرایان فراوانی‌ست که هرکدام‌شان صدای متفاوتی با ویژگی‌های مختص به خودشان را سر می‌دهند و ما قادر به تمایز صداهای منفرد از یکدیگر نیستیم. اما PSR J1823-3021A فرق می‌کند: کار این تپ‌اختر شبیه به یک تک‌نوازی است که تمامی آهنگ‌های پرتو گامای خوشه میزبانش را خودش تولید می‌کند. اینکه می‌شود از چنین فاصله دوردستی این صدا را شنید، حقیقت شگفت‌انگیزی‌ست." این را تایرل جانسون (Tyrel Johnson) پژوهشگر همکار بنیاد ملی علوم ایالات متحده و شاغل در پژوهشگاه نیروی دریایی در واشنگتن، که از نویسندگان مقاله مزبور هم بوده است می‌گوید و می‌افزاید: "درک اینکه چگونه این MSP که میدان مغناطیسی نامتعارف‌گونه و نوسانات پرتو گامای درخشان‌اش آن را سردسته این نغمه‌سرایان کیهانی کرده، ایجاد شده است به ارتقای دانسته‌هایمان در خصوص فیزیک حاکم بر میدان‌های مغناطیسی نامتعارفی که نمی‌توانیم آن‌ها را در آزمایشگاه‌های زمینی بازسازی کنیم، کمک شایان توجهی خواهد کرد."
 
Pulsars, the lighthouses of the cosmos
A pulsar (a star whose light pulses) is in fact an ultra-compact neutron star with an extremely strong magnetic field that emits electromagnetic radiation as it spins. The discovery of the millisecond pulsar J1823-3021A in the globular cluster NGC 6624, and its confirmation as a bright source of gamma-ray emission, counts as the most important discovery made by the Fermi gamma-ray telescope since its launch into space in 2008.
 
Globular clusters appear to contain some of the oldest stars in the galaxy, although their origin is still unclear. Gamma-ray pulsars had been observed before, with a history that in some cases reaches back to 1665, but until now the detection of their gamma-ray emission alone had never played a part in discovering one, because those rays were regarded as a somewhat secondary component.
 
Millisecond pulsars have rotation periods of between 1 and 10 milliseconds, and because of the high frequency of their pulses, the Fermi telescope sees them as sources of almost continuous gamma rays. That is only the case, of course, if the beam points towards Earth; this restriction, known as the "lighthouse effect", means that as the pulsar spins and its beam periodically sweeps out of our line of sight, its light reaches us as a train of successive pulses.
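
To connect these spin periods with the "rotations per minute" figure quoted earlier in the article, here is a trivial conversion sketch; the sample periods are illustrative values spanning the 1 to 10 ms range.

```python
# Convert a pulsar spin period into rotation frequency and rotations per minute
def spin_stats(period_ms: float) -> tuple[float, float]:
    period_s = period_ms / 1000.0
    freq_hz = 1.0 / period_s      # rotations per second
    return freq_hz, 60.0 * freq_hz

for period_ms in (1.4, 5.0, 10.0):    # illustrative periods spanning the 1-10 ms range
    freq_hz, rpm = spin_stats(period_ms)
    print(f"P = {period_ms:4.1f} ms -> {freq_hz:6.1f} Hz, about {rpm:8,.0f} rotations per minute")
```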
 
Dancing, singing corpses
Most millisecond pulsars are found in globular clusters: spherical, extremely dense gatherings of hundreds of thousands of stars held tightly together by their own gravity, all orbiting the core of the galaxy. These clusters are always found in the regions surrounding the galactic centre, and their stars are often more than 10 billion years old. So far, about 158 globular clusters have been found around the centre of the Milky Way.
 
J1823-3021A is the youngest MSP observed to date and, according to Bruce Allen, director of the Max Planck Institute for Gravitational Physics in Germany, who had no hand in the research, it is also highly significant, because "globular clusters are graveyards of old, dead pulsars. Just imagine that while walking through a cemetery, instead of looking at dust-covered bones, you see a corpse that is dancing and singing".
 
Powerful gamma-ray sources are usually identified from the long-running pulsations of their faint light, which arrive with periods of 1.4 to 8.5 milliseconds; light so faint that, before the launch of the Fermi telescope, such sources had never been detected individually at all.
 
Confirming that the globular cluster NGC 6624 hosts a clear, well-defined MSP will be of great help to scientists in calculating the pulsar's orbital parameters and mass, because recreating such conditions in Earth-bound laboratories is all but impossible. "Before Fermi's launch it was not at all clear whether MSPs could even be counted among the powerful gamma-ray sources," says Johnson. Allen adds: "The discovery of this millisecond pulsar is an extraordinarily surprising one, and the paper makes a convincing case that more objects of this kind should be found in the future. It changes our view of the kinds of animals prowling the cosmic pulsar zoo." Given the collaboration within the community of scientists working on Fermi's raw data, there will naturally also be researchers who want to revisit the sometimes contradictory hypotheses about the origin and evolution of globular clusters.
 
"The ages and energies of pulsars can be worked out from radio timing followed by a few simple, standard equations," says Johnson, "and those data had already told us that this pulsar is astonishingly young and energetic. We suspected, however, that the radio timing had been distorted by the cluster's enormously strong gravitational field, but Fermi's measurements of the pulsar's intense gamma-ray emission showed us that there is nothing wrong with the numbers. So J1823-3021A is a newborn pulsar, created by some mysterious process and hiding itself among an elderly population."
 
 

Norman Ramsey: 1915–2011

The US physicist Norman Ramsey, who shared the 1989 Nobel Prize for Physics, died on 4 November at the age of 96. Ramsey's work on probing the structure of atoms to high precision was instrumental in the later development of the atomic clock, as well as in medical applications such as magnetic resonance imaging (MRI), which is now widely used to image the inside of the body by probing atomic nuclei.

Ramsey's pioneering work in the 1940s followed that of his PhD supervisor – Nobel laureate Isidor Isaac Rabi from Columbia University. In 1937 Rabi invented a technique called atomic-beam magnetic resonance to study the structure of atoms by probing transitions between their energy levels with radiation. This technique involves passing a beam of atoms through a homogeneous magnetic field before subjecting it to a single oscillating electromagnetic field, which is set to a frequency that induces transitions between certain energy levels in the atoms. The radiation emitted has a characteristic frequency or wavelength that depends on the energy difference between the two levels.

Any inhomogeneity in the magnetic field, however, was found to widen the resonance line and therefore have a negative impact on the accuracy of the experiment. In 1949 Ramsey modified Rabi's method by introducing two separated oscillatory fields. This means that the atoms can be excited in either one of the two regions, thus producing an interference pattern with a sensitivity that depends on the distance between the two oscillatory fields but that is independent of the degree of homogeneity of the magnetic field between them. This made it possible to greatly improve the accuracy of Rabi's method – reducing the width of the transition spectral line by as much as 35%.
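
As a rough illustration of why separating the fields sharpens the line, the sketch below evaluates the idealised Ramsey fringe pattern for two π/2 pulses separated by a free-evolution time T. The textbook approximation P(f) ≈ cos²(πfT) for a detuning f is an assumed simplification, not a model of Ramsey's actual apparatus, and it shows the central fringe narrowing as 1/(2T).

```python
import numpy as np

# Idealised Ramsey pattern for two pi/2 pulses separated by free-evolution time T:
# transition probability P(f) ~ cos^2(pi * f * T), where f is the detuning in Hz.
# The central fringe then has a width (FWHM) of 1/(2T), set by T alone and therefore
# insensitive to field inhomogeneity between the two interaction zones.
def central_fringe_fwhm(T_seconds: float) -> float:
    f = np.linspace(-0.5 / T_seconds, 0.5 / T_seconds, 10001)   # scan across the central fringe
    p = np.cos(np.pi * f * T_seconds) ** 2
    in_fringe = f[p >= 0.5]
    return in_fringe.max() - in_fringe.min()

for T in (0.01, 0.1):   # a longer free flight gives a proportionally narrower line
    print(f"T = {T:4.2f} s -> FWHM = {central_fringe_fwhm(T):.2f} Hz "
          f"(analytic 1/(2T) = {1 / (2 * T):.2f} Hz)")
```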

Ramsey's breakthrough allowed more precise measurements of atomic-energy spectra and has subsequently been used in caesium clocks, which have provided our standard of atomic time since 1967. His technique also provided the basis for nuclear-magnetic-resonance spectroscopy and MRI. In 1960 Ramsey and colleagues also began to develop the hydrogen maser – a device that produces coherent electromagnetic waves through amplification by stimulated emission – together with Daniel Kleppner from the Massachusetts Institute of Technology.

It was for this work – the "invention of the separated-oscillatory-fields method and its use in the hydrogen maser and other atomic clocks" – that Ramsey was awarded half of the 1989 Nobel Prize for Physics. The other half was shared by Hans Dehmelt from the University of Washington and Wolfgang Paul from the University of Bonn, Germany, "for the development of the ion-trap technique".

Born on 27 August 1915 in Washington, DC, Ramsey went on to study mathematics at Columbia College in New York and graduated in 1935. He then moved to Cambridge University in the UK, where he obtained a second Bachelor's degree – this time in physics – before heading back to Columbia to do a PhD in the new field of magnetic resonance that was supervised by Rabi. Ramsey stayed on at Columbia until 1947, before moving to Harvard University where he spent the remainder of his career before retiring in 1986.

Quantum mechanics and general relativity at a crossroads

Unifying quantum mechanics and general relativity is one of the most exciting open questions in modern physics. General relativity, the unified theory of gravity, makes predictions about space and time that become apparent on the cosmic scales of stars and galaxies. Quantum effects, on the other hand, are weak and are typically observed on small scales, for example in individual atoms and particles. That is why testing the interplay between quantum mechanics and general relativity is so difficult. Now theoretical physicists led by Časlav Brukner at the University of Vienna have proposed a novel experiment that could probe the common ground of the two theories. The focus of the work is to measure the general relativistic nature of time on the quantum scale.

One of the predictions of general relativity is that gravity affects the flow of time. The theory predicts that clocks running near a massive body tick more slowly than clocks farther away from it. This effect leads to the "twin paradox": if one of the twins lives at high altitude, he ages faster than the other twin who stays on the ground. The effect has been well confirmed in classical experiments, but not yet together with quantum mechanics, and that is the goal of the newly proposed experiment.

The Viennese researchers want to examine the extraordinary possibility that a single quantum particle can lose the classical property of having a well-defined position, or, as it is phrased in quantum mechanics, be in a "superposition". This gives rise to "wave-like" effects (called "interference") with a single particle. However, if the particle's position is measured, or even if it could in principle be determined, the effect is lost. In other words, it is impossible to observe interference and at the same time know the particle's position. Such a connection between information and interference is an example of quantum complementarity.

The University of Vienna group considers a clock (any particle with an internal degree of freedom, such as spin) placed in a superposition of two locations, one closer to and one farther from Earth's surface. According to general relativity, the clock ticks at different rates in the two places, just as the twins age differently. But since the time measured by the clock reveals where the clock is located, the interference and the wave nature of the clock are lost. "This is the twin paradox for a quantum only child, and resolving it requires both general relativity and quantum mechanics! Such an interplay between the two theories has never been tested before," says Magdalena Zych, the paper's lead author and a member of the Vienna doctoral programme CoQuS. That is why the experiment would allow us to test the general relativistic nature of time together with quantum mechanics.

LHC trials proton–lead collisions

Physicists at CERN's Large Hadron Collider (LHC) are analysing the results of their first attempt at colliding protons and lead ions. Further attempts at proton–lead collisions are expected over the next few weeks. If these trials are successful, a full-blown experimental programme could run in 2012.

Since the Geneva lab began experiments with the LHC in 2009, it has mostly been used to send two beams of protons in opposite directions around the 27 km accelerator, with the hope of spotting, among other things, the Higgs boson in the resulting collisions. Two beams of lead ions have also been smashed into each other in order to recreate the hot dense matter, known as a quark–gluon plasma, that was present in the early universe. But to fully understand the results of such collisions, physicists need to know the properties of the lead ions before they collide. That is, their "cold state" before vast amounts of heat are released by the collisions. One way to do this, according to Urs Wiedemann at CERN, is to collide protons with lead ions.

Parton distribution

The problem at the moment is that our knowledge of the "parton distribution functions" for high-energy lead ions is not good enough to fully understand the results of lead–lead collisions at the LHC. Partons are the quarks and gluons that make up hadrons such as protons and neutrons – and hence the lead nuclei. At low energies hadrons can be thought of as containing just three valence quarks that interact via gluons. However, at the energies found in the LHC, hadrons comprise a large number of additional partons that can significantly affect how collisions occur. This "sea" of partons is described by a distribution function, which cannot be calculated to the desired degree of accuracy – so physicists must rely on experimental measurements.

The advantage of lead–proton collisions is that when a proton smashes into a lead nucleus, it does not heat the nucleus up much. The collision can therefore be analysed to reveal important details about the parton distribution functions of the lead ions in their cold state.

First test 'successful'

The first test occurred on Monday and lasted 16 hours. CERN accelerator physicist John Jowett described it as "extremely successful". Jowett and colleagues first injected a few lead bunches in the presence of 304 proton bunches. A few bunches of each were then accelerated to the LHC's current full energy of 3.5 TeV for protons and 287 TeV for lead ions – or 1.38 TeV per lead nucleon.
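
The per-nucleon figure follows from simple bookkeeping: a fully stripped lead ion carries 82 proton charges, so in the same magnetic lattice it reaches 82 times the proton energy, shared among its 208 nucleons. The snippet below replays that arithmetic; the charge and mass numbers for lead-208 are standard values rather than figures quoted in the article.

```python
# Beam-energy bookkeeping for proton and lead beams in the same magnetic lattice
proton_energy_tev = 3.5   # energy per proton, as quoted
Z_lead = 82               # proton (charge) number of lead
A_lead = 208              # nucleon number of lead-208, the isotope used at the LHC

ion_energy_tev = Z_lead * proton_energy_tev          # same rigidity -> energy scales with charge
per_nucleon_tev = ion_energy_tev / A_lead

print(f"lead-ion energy ~ {ion_energy_tev:.0f} TeV")       # ~ 287 TeV
print(f"energy/nucleon  ~ {per_nucleon_tev:.2f} TeV")      # ~ 1.38 TeV
```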

At this top energy both beams are extremely relativistic. This means that with a small adjustment of the orbits, the protons and lead ions take the same time to complete one circuit of the accelerator. Initially, the lead and proton bunches met some 9 km away from the ATLAS experiment, but the team was able to move this meeting point back to the centre of ATLAS. This procedure lines up the bunches so they collide properly in all four experiments.

Jowett stresses that the beams were separated transversely so there have been no collisions yet. He added, "We still need to do some analysis of our data to determine whether the lead beam sizes were being blown up more than they usually are by other effects – that's important for projecting future performance." Further sessions on proton–lead collisions are planned at the LHC over the next four weeks, although most of the beam time in November will be devoted to lead–lead collisions.

Physics Day

In order to familiarise school students with physics, the Department of Physics at the University of Tehran held an event called "Visit Day" from 1386 to 1389 (roughly 2007 to 2010). Its aim was to introduce the science of physics, and physics as a field of study, to high-school students in Tehran. Similar activities have been tried at other physics departments around the country to open the doors of departments and laboratories to interested members of the public. Proposing that these programmes be run in a coordinated, uniform way across the country, the Physics Society of Iran designated 3 Azar of the current year as "Physics Day" for the 1390-91 academic year, in order to encourage other physics departments to join in. This year three leading universities, the University of Tehran, Sharif University of Technology and Isfahan University of Technology, have taken the lead in organising the event.
School students, science and engineering undergraduates, and anyone interested in physics can register for the event through the website of the Physics Society of Iran. This year the event will be held in two cities, Tehran and Isfahan. The day's programme includes an introduction to physics, a look at some current problems in physics, the "Physics House" (Fizik Sara), demonstration experiments, hands-on experiments by the participants, lunch, and tours of the research laboratories. We hope that, with a good public turnout, the event will be held even more enthusiastically at universities across the country in the coming years. The registration deadline is 25 Aban 1390. Physics teachers in Tehran and Isfahan can contact the Society's office for group registration.

Scientists still seek explanation for faster-than-light neutrino result

Scientists on the OPERA experiment announced last month that they had measured neutrinos traveling faster than the speed of light. Either this was the start of a revolution or the result of a systematic error. Being scientists, they assumed the latter and asked for help finding the glitch.

Despite what some headlines have suggested, the question of whether the result is correct is still up in the air. Experimentalists have not been able to establish how the experiment is flawed, and yet theorists have not been able to determine how its conclusion could be true. “There’s no model that explains it,” said CERN theorist Gian Giudice.

Scientists have posted dozens upon dozens of reactions to the OPERA result on the arXiv, an open-access archive of scientific preprints, since scientists announced it on Sept. 23. But each proposed explanation contradicts other established measurements in particle physics. “Things have moved quite fast in these past few weeks,” Giudice said. “In my opinion, we’ve almost reached the point of saturation. The situation looks pretty grim.”

Finding the glitch

Experimentalists could ease theorists’ minds if they could find a problem with the original measurement. Plenty of people have tried, said Antonio Ereditato, spokesperson for the OPERA experiment. “After our seminar when we requested collaboration, we got some 700 emails,” he said.

People asked whether OPERA physicists had taken into account the rotation of the Earth, general relativity, continental drift and other factors that might affect their measurement. In response, the OPERA collaboration has explicitly calculated some of the effects they had originally argued would be negligible. Those trying to explain the OPERA result have quite a job to do in clearing up the difference between the measured speed of the neutrinos and the speed of light.

Neutrinos that the OPERA collaboration studied appeared to beat light by 60 nanoseconds traveling the 730 kilometers between CERN in Switzerland and Gran Sasso National Laboratory in Italy. “But light only takes 2.4 milliseconds to make this trip,” he said. “Imagine a faraway galaxy emitting light and neutrinos. Depending on the distance, you could have neutrinos arriving 10 or more years earlier than light.”
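
To put the 60 ns in perspective, the sketch below works out the implied fractional speed excess and, under the naive assumption that the same excess holds over astronomical distances, the head start neutrinos would build up from a far-away source. The half-million-light-year source distance is an arbitrary illustration, not a figure from the article.

```python
# How big is a 60 ns lead over 730 km, expressed as a fraction of the speed of light?
c_km_per_s = 299_792.458
baseline_km = 730.0
lead_s = 60e-9

light_time_s = baseline_km / c_km_per_s            # ~ 2.4 ms, as quoted
fractional_excess = lead_s / light_time_s          # (v - c) / c, to first order
print(f"light travel time ~ {light_time_s * 1e3:.2f} ms")
print(f"fractional speed excess ~ {fractional_excess:.1e}")

# Naive extrapolation to an astronomical baseline (illustrative distance only)
distance_ly = 500_000                              # hypothetical source half a million light-years away
head_start_years = distance_ly * fractional_excess
print(f"head start over {distance_ly:,} light-years ~ {head_start_years:.1f} years")
```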

The new calculations did not cast doubt on the result, Ereditato said. “One actually increased our effect by 2 nanoseconds.”

OPERA will add these details to their scientific paper this month, addressing scientists’ queries but leaving open the question: If neutrinos cannot travel faster than light, what is causing this mistaken measurement? “If it were something obvious, it would’ve certainly come out during these weeks,” Giudice said. “I think the majority of physicists agree the job was done very carefully.”

A challenge from theory

One challenge to the validity of the OPERA result that has received media attention recently stems from a theoretical paper by physicists Andrew Cohen and Sheldon Glashow, a Nobel laureate. Cohen and Glashow wrote that, if a neutrino were to surpass the speed of light, it would emit pairs of electrons and positrons, thus losing energy during flight. We see a similar effect when particles are able to outpace light while traveling through water.

The OPERA experiment, along with a neighboring experiment, ICARUS, found several examples of neutrinos that managed to arrive at Gran Sasso with their high energies intact, meaning that they had not shed electron-positron pairs during their journey as predicted. Scientists also did not detect stray electron-positron pairs coming from the traveling neutrinos. It seems the radiation Cohen and Glashow predicted did not occur.

If Cohen and Glashow are right, the neutrinos traveling from CERN to Gran Sasso did not beat the cosmic speed limit. However, their prediction has not been proven experimentally, and in science, experimental results trump theoretical predictions. The scientific community’s best option seems to be waiting for a second opinion from the MINOS neutrino experiment at Fermilab. MINOS physicists will take a similar measurement next year after making an upgrade to their detector in December. They may be able to collect enough data before a long shutdown next summer. But if not, the scientists at OPERA and the rest of the world might need to wait more than a year for an answer.

For now, the mystery remains, and the hunt for answers continues.

Asteroid has primordial core

The latest results from the Rosetta space probe reveal that asteroid 21 Lutetia might have a dense metal-rich core that formed at the very start of the solar system. The fact that such a primordial core lies beneath layers of rock challenges our understanding of what the solar system was like before the planets formed.

Rosetta was launched by the European Space Agency in 2004 and its final destination is the comet 67P/Churyumov–Gerasimenko, which it is due to reach in 2014. So far on its decade-long journey, it has also encountered two asteroids – 2867 Šteins and 21 Lutetia – in the main asteroid belt between Mars and Jupiter. Rosetta came to within 3200 km of 21 Lutetia in July 2010 and made detailed measurements of the asteroid. Today, astronomers have published three scientific papers based on those measurements of volume, mass and spectral features – with unexpected findings.

During the fly-by, 60 images from Rosetta's Optical, Spectroscopic and Infrared Remote Imaging System (OSIRIS) instrument were used to determine that the asteroid measures about 121 × 101 × 75 km, an overall volume only 5% different from that predicted by ground-based observations. "I was very surprised by how well the two techniques matched," explains Holger Sierks, from the Max Planck Institute for Solar System Research, Germany, and lead author of one of the papers.

Feeling gravity's tug

A second paper reports on 21 Lutetia's mass, which is inferred from the gravitational influence the asteroid had on the approaching spacecraft. The velocity of Rosetta was altered by the asteroid's tug and this manifested itself in Doppler shifts in the radio signals it returned to Earth. After taking the gravitational influence of other solar system bodies into account, 21 Lutetia changed the frequency of Rosetta's signals by 36.2 mHz, which translates to a mass of 1.7 × 10¹⁸ kg.

Armed with the mass and volume of the asteroid, the researchers were able to calculate its density. What they found surprised them. "It turns out that 21 Lutetia is one of the densest known asteroids," explains Sierks. With a bulk density of 3.4 g/cm³, it is denser than most meteorite samples. Most previously observed asteroids vary in density between 1.2 and 2.7 g/cm³. This is because most are "Humpty-Dumpty" asteroids: those that have been smashed apart by collisions before slowly being put back together again by gravity. The gaps between the recombined rocks cause these asteroids to have low densities – but the Rosetta results suggest that 21 Lutetia cannot be such an asteroid.
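
As a sanity check on the quoted density, the sketch below combines the measured mass with an ellipsoid approximation to the 121 × 101 × 75 km dimensions. Treating the asteroid as a smooth triaxial ellipsoid is a simplification of the OSIRIS shape model, so the result is only expected to land near the published 3.4 g/cm³.

```python
import math

mass_kg = 1.7e18                           # from the Doppler tracking result
a, b, c = 121e3 / 2, 101e3 / 2, 75e3 / 2   # semi-axes in metres, from the OSIRIS dimensions

# Crude shape model: a triaxial ellipsoid (the real shape model is lumpier)
volume_m3 = 4.0 / 3.0 * math.pi * a * b * c
density_kg_m3 = mass_kg / volume_m3

print(f"ellipsoid volume ~ {volume_m3:.2e} m^3")
print(f"bulk density     ~ {density_kg_m3 / 1000:.1f} g/cm^3")   # ~ 3.4-3.5 g/cm^3
```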

Researchers also used Rosetta's Visible, Infrared and Thermal Imaging Spectrometer (VIRTIS) instrument to work out the asteroid's composition, reporting the results in the final paper of the trio. They concluded that 21 Lutetia's regolith – the layer of dust and soil that sits atop the underlying rock – shows similar thermal properties to the powder found on the Moon. Therefore, the asteroid's regolith is likely to have a similar density of about 1.3 g/cm³. This means that the interior of the asteroid must be even denser than the overall figure of 3.4 g/cm³. VIRTIS also failed to find signatures of metal minerals on 21 Lutetia's surface, which provides an important clue as to the origin of the asteroid.

Primordial core is intact

Sierks believes that the Rosetta results suggest that 21 Lutetia has a primordial origin. "A metal-rich core, formed just 1–2 million years after the formation of the solar system, would account for the high density and perhaps also explain why we don't see metals on the surface," he said. It would have to have formed that early in order for fast-decaying radioactive isotopes to keep the fledgling asteroid molten, allowing the heaviest materials [metals] to sink toward the centre. "That would make 21 Lutetia a planetesimal [a building block of planets] and it would have initially been spherical," he added. Billions of years of collisions with other bodies would have slowly chiselled 21 Lutetia into the gnarled body it is today, while leaving its primordial core intact.

However, not everyone agrees with the dense-core explanation. "The data are great, but the interpretation is flawed," warns Denton Ebel, meteorite researcher at the American Museum of Natural History in New York. "The inference of a metal-rich bulk planetesimal composition is a stretch," he says.

Even if the interpretation is correct, however, Erik Asphaug of the University of California, Santa Cruz, thinks the finding causes problems. "The concept of having a highly differentiated body, that is at the same time covered in rock, doesn't fit with our previous understanding of how the solar system was formed," he says. "It seems Lutetia violates some of the holy precepts of solar system origins."

The accelerating expansion of the Universe

The Nobel Prize in Physics 2011 honoured revolutionary, completely unexpected observations of the expansion of our Universe.

The Award was divided: One half was awarded to Saul Perlmutter and the other half jointly to Brian Schmidt and Adam Riess "for the discovery of the accelerating expansion of the Universe through observations of distant supernovae."

The expansion of our Universe had already been discovered by the astronomer Edwin Hubble back in 1929, forcing Einstein to revise his famous equations of space, time and mass. So what made these new observations so special for the Scandinavian award committees? It was not the discovery that the Universe is expanding, but that it is expanding at an increasing speed.

The scientists' approach to measuring this expansion was very clever: the teams observed a special class of stellar explosion, so-called type Ia supernovae, the detonations of aged stars roughly as heavy as our Sun but only about the size of Earth. They discovered more than 50 distant supernovae and found that their light was surprisingly fainter than expected, from which they concluded that the expansion of the Universe is accelerating. Einstein's equations have since been revised a second time to cope with this new situation.

Two exciting, still unsolved questions arise: what kind of force or energy could push entire galaxies away from each other, working against the strong, far-reaching gravitational pull of huge clusters of galactic mass? And what, on top of that, stabilises these galaxies so that their outer stars move on stable orbits much faster than Newton's and Einstein's formulas allow?

Some years ago, scientists introduced the term "dark energy" to describe the accelerating expansion and the term "dark matter" to capture the phenomenon of galaxies that remain stable despite their very fast-moving outer stars. Together, dark energy and dark matter add up to an astonishing 96 per cent of the total energy content of our Universe, once the new picture is balanced against the earlier views of theoretical physicists and the subsequent discoveries of astrophysicists.

This brings us back to Einstein's conception of space and time a century ago: is it possible to extend his formulation of space, time and mass a third time to cover the new discoveries as well? Might such a revision finally steer science towards a first way of combining Heisenberg's and Planck's quantum physics with Einstein's space-time continuum? The answer seems to be yes, because one peculiarity of Einstein's formulation has not yet been used extensively in thinking about the accelerating expansion of the Universe: his equating of length and time. Einstein introduced time as a fourth dimension on an equal footing with the three spatial dimensions of length, width and height. Let us now suppose, theoretically, that there are two or more universes overlapping in such a simple way that one spatial dimension of each coincides with Einstein's time dimension of the others.

The result is as astonishing as it is exciting, because what we get are flat, overlapping 2-D spaces around us that are extremely difficult to detect, since they have only two spatial dimensions. We could try to assign such flat spaces around us, for example, to electromagnetism, in case electromagnetic waves turn out to be flat; and they are indeed flat, as anyone can verify simply with the horizontally and vertically polarised glasses used for 3-D films. These glasses use the flat, 2-D nature of the waves to filter light and to separate the information meant for the right eye from that meant for the left. This example shows, the argument goes, that such flat spaces around us do in fact exist and that they become visible through electromagnetism, which merely generates turbulence on these coinciding dimensions.

What would happen if the coincidence between Einstein's axis of time progression and one spatial dimension of these flat 2-D spaces around us were slowly lost? The answer is simple: an observer's space would expand at increasing speed at the expense of the time reserves remaining for the future. From this point of view, these flat spaces around us turn out to be one possible source of dark energy for the accelerating expansion of the Universe. Points that were sequential in time leap into simultaneous points in space. This is a kind of leakage from a potential future time span into spatial expansion towards a lower energy state, since storing events in time requires additional energy, much like charging a battery. Finally, we could rotate two overlapping flat spatial dimensions further against each other until they oppose each other's time progression and spatial dimensions. We can do this without conflicting with any of Einstein's descriptions only if we introduce Planck's well-established quantisation scheme, with minimum units of length and time. Below these Planck units, time and length no longer appear as such. In this way we obtain a remarkable dark-matter effect, accumulating as halos below the undisputed Planck units and, together with dark energy, structuring the 96 per cent of the Universe's energy processes confirmed by NASA.

One Clock With Two Times: When Quantum Mechanics Meets General Relativity

The unification of quantum mechanics and Einstein's general relativity is one of the most exciting and still open questions in modern physics. General relativity, the joint theory of gravity, space and time, gives predictions that become clearly evident on the cosmic scales of stars and galaxies. Quantum effects, on the other hand, are fragile and are typically observed on small scales, e.g. when considering single particles and atoms. That is why it is very hard to test the interplay between quantum mechanics and general relativity.

Now theoretical physicists led by Časlav Brukner at the University of Vienna propose a novel experiment which can probe the overlap of the two theories. The focus of the work is to measure the general relativistic notion of time on a quantum scale.

Time in general relativity

One of the counterintuitive predictions of Einstein's general relativity is that gravity distorts the flow of time. The theory predicts that clocks tick slower near a massive body and tick faster the further they are away from the mass. This effect results in a so-called "twin paradox": if one twin moves out to live at a higher altitude, he will age faster than the other twin who remains on the ground. This effect has been precisely verified in classical experiments, but not in conjunction with quantum effects, which is the aim of the newly proposed experiment.
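
For a sense of scale, the sketch below evaluates the standard weak-field rate shift gΔh/c² between two clocks separated in height near Earth's surface. The chosen height differences and the 80-year baseline are arbitrary illustrative numbers, not values from the proposal.

```python
# Fractional gravitational rate difference between two clocks separated in height,
# in the weak-field approximation: d(tau_high - tau_low)/dt ~ g * dh / c^2
g = 9.81            # m/s^2, surface gravity
c = 299_792_458.0   # m/s

seconds_per_year = 365.25 * 24 * 3600
for dh_m in (1.0, 1000.0):                       # illustrative height differences
    rate_shift = g * dh_m / c**2
    gain_s = rate_shift * 80 * seconds_per_year  # extra time accrued over an 80-year "lifetime"
    print(f"dh = {dh_m:6.0f} m -> fractional shift ~ {rate_shift:.2e}, "
          f"gain over 80 years ~ {gain_s * 1e6:.2f} microseconds")
```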

Quantum interference and complementarity

The Viennese group of researchers wants to exploit the extraordinary possibility that a single quantum particle can lose the classical property of having a well-defined position, or as phrased in quantum mechanical terms: it can be in a "superposition." This allows for wave-like effects, called interference, with a single particle. However, if the position of the particle is measured, or even if it can in principle be known, this effect is lost. In other words, it is not possible to observe interference and simultaneously know the position of the particle. Such a connection between information and interference is an example of quantum complementarity -- a principle proposed by Niels Bohr. The experimental proposal now published in "Nature Communications" combines this principle with the "twin paradox" of general relativity.

Einstein's "twin paradox" for a quantum "only child"

The team at the University of Vienna considers a single clock (any particle with evolving internal degrees of freedom such as spin) which is brought in a superposition of two locations -- one closer and one further away from the surface of Earth. According to general relativity, the clock ticks at different rates in the two locations, in the same way as the two twins would age differently. But since the time measured by the clock reveals the information on where the clock was located, the interference and the wave nature of the clock are lost. "It is the twin paradox for a quantum 'only child', and it requires general relativity as well as quantum mechanics. Such an interplay between the two theories has never been probed in experiments yet," says Magdalena Zych, the lead author of the paper and member of the Vienna Doctoral Program CoQuS. It is therefore the first proposal for an experiment that allows testing the genuine general relativistic notion of time in conjunction with quantum complementarity.
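
To see how quickly the which-path information carried by such a clock could wash out the fringes, here is a minimal sketch assuming a two-level clock whose interference visibility falls as |cos(ΔE Δτ / 2ħ)|, the form in which results of this kind are usually stated. The 1.5 eV energy gap, 1 m separation and hold times are hypothetical numbers, not parameters from the paper.

```python
import math

hbar = 1.054_571_817e-34    # J*s
g = 9.81                    # m/s^2
c = 299_792_458.0           # m/s

# Hypothetical two-level clock: an optical transition with a ~1.5 eV energy gap
delta_E = 1.5 * 1.602_176_634e-19    # J
dh = 1.0                             # metres of height separation between the two arms

def visibility(hold_time_s: float) -> float:
    # Proper-time difference accumulated between the arms (weak-field approximation)
    delta_tau = g * dh * hold_time_s / c**2
    # Assumed visibility law for a two-level clock: V = |cos(delta_E * delta_tau / (2*hbar))|
    return abs(math.cos(delta_E * delta_tau / (2 * hbar)))

# Hold time after which the clock's "which-path" information first erases the fringes completely
t_zero = math.pi * hbar * c**2 / (delta_E * g * dh)
print(f"visibility after 1 s ~ {visibility(1.0):.4f}")
print(f"visibility after 5 s ~ {visibility(5.0):.4f}")
print(f"fringes first vanish after ~ {t_zero:.1f} s for dh = {dh} m")
```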
