Showing posts with label physics. Show all posts

Wednesday, February 15, 2023

Supersymmetry vs QCD

An inspiring discussion: Dr. Peter Woit finds a declining trend in the number of articles on supersymmetry (SUSY) since about 2015. See here.

But what meaning can we draw from the absolute number?  I find it meaningful to compare with some other subject in particle physics.  I think "QCD" (quantum chromodynamics, the strong force) is a relatively stable subject of study, and it is about something real, unlike SUSY, which has proven to be quite speculative.

So here it is.  The data had to be collected by hand, so there might be transcription errors.

But the downward trend in SUSY relative to QCD begins around 1999.  Maybe particle physics is a bit healthier than expected.
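The comparison itself is simple enough to sketch. The counts below are invented placeholders, not the hand-collected data described above; they exist purely to show the arithmetic behind a SUSY/QCD ratio per year:

```python
# Hypothetical article counts per year as (SUSY, QCD) pairs.
# These numbers are made up for illustration only.
counts = {
    1995: (400, 900),
    1999: (650, 1000),
    2005: (600, 1100),
    2015: (450, 1150),
    2020: (300, 1200),
}

# Ratio of SUSY articles to QCD articles for each year.
ratios = {year: susy / qcd for year, (susy, qcd) in counts.items()}

for year in sorted(ratios):
    print(f"{year}: SUSY/QCD = {ratios[year]:.2f}")
```

Normalizing by QCD this way factors out overall growth or shrinkage of the field, which is the point of using a stable subject as the baseline.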





Monday, October 24, 2022

On Dark Matter

Some quotes from "New Directions in the Search for Dark Matter" (https://arxiv.org/abs/2204.03085) by Surjeet Rajendran, Johns Hopkins University.

The paper is a good backgrounder on how we might find out what dark matter is comprised of; but there is also a philosophy of physics that has largely been forgotten in all the stringy revolutions.

The existence of dark matter proves that there is physics beyond the standard model. But, other than its existence, observational limits on its properties are extremely weak.

....

Given the vastness of this parameter space, how can we hope to make progress? When confronted with this vastness, there is a human tendency to artificially restrict it by focusing on “theoretically well motivated” dark matter - in this context, “theoretically well motivated” means particles that theorists have already written down for some other reason. While it is certainly possible that the existence of dark matter may be tied to the solution to some other problem in particle physics, such a connection is not a logical requirement. It is a fantasy to think that the particle spectrum of the world can be figured out entirely from first principles. I have not come across a physicist who has convinced me that their refined sense of theoretical insight would have allowed them to figure out (without experimental input) that the Standard Model is a SU(3) × SU(2) × U(1) gauge theory with the SU(3) confined at low energies, the SU(2) × U(1) broken in a weird way leaving an unbroken U(1), with three generations of quarks and leptons that have hierarchical Yukawa couplings with only the top quark possessing a naturally large Yukawa coupling while also containing nearly massless neutrinos and a highly fine tuned Higgs boson. Our job as physicists is to discover what nature actually is rather than attempt to constrain it from the armchair.

...

A skeptical reader may ask if we should actually care about technical naturalness. After all, we now have very solid evidence of at least two fine tuned quantities in our universe - the cosmological constant and the higgs boson itself. Neither of these terms are protected by symmetry and the absence of symmetry did not prevent their existence, creating confounding theoretical problems. Our job as physicists is to figure out what is out there in the world instead of imposing philosophies on it - especially philosophies that are already empirically known to be violated.

.... 

The identification of the nature of dark matter is pretty clearly one of the major problems confronting particle physics. It is exceedingly unlikely that humanity will solve this problem from the armchair by guessing a sufficiently pretty theory. Physics is an experimental field - the belief that we can figure out what is out there in the world without experimental input has always just been a silly fantasy. Given the vastness of the parameter space of dark matter, there is a tremendous need to dramatically widen the experimental program that has been pursued to detect its properties. Now, it could have been the case that this dramatic widening could only come at great cost - if every probe of a part of dark matter parameter space required billions of dollars and thousands of working hours, we will not be able to appreciably probe the dark matter parameter space in our lifetimes. Luckily, this is not the case - the methods and experiments described in these lectures are experiments that can be pursued by a small number of investigators at the cost of several million dollars per experiment. It is thus possible to sustain a robust ecosystem of dark matter experiments which will cover a significant range of parameter space. While the creation of such a program is not up to me, I certainly hope that this broad ranged program will come to be realized.


Friday, April 09, 2021

Bhabha scattering and the anomalous magnetic dipole moment of the muon

Just noting some observations here, likely of no significance at all. 

  •  Bhabha scattering is the electron-positron scattering process. 

  •  Per Wikipedia:
Electron-positron colliders operating in the region of the low-lying hadronic resonances (about 1 GeV to 10 GeV), such as the Beijing Electron Synchrotron (BES) and the Belle and BaBar "B-factory" experiments, use large-angle Bhabha scattering as a luminosity monitor. To achieve the desired precision at the 0.1% level, the experimental measurements must be compared to a theoretical calculation including next-to-leading-order radiative corrections. The high-precision measurement of the total hadronic cross section at these low energies is a crucial input into the theoretical calculation of the anomalous magnetic dipole moment of the muon, which is used to constrain supersymmetry and other models of physics beyond the Standard Model.

  • One would think Bhabha scattering is extremely well understood in terms of theoretical calculations.  So I was surprised to find this paper from 2020:

    Patrick Janot, Stanisław Jadach,
    Improved Bhabha cross section at LEP and the number of light neutrino species,
    Physics Letters B, Volume 803, 2020, 135319, ISSN 0370-2693, https://doi.org/10.1016/j.physletb.2020.135319. (https://www.sciencedirect.com/science/article/pii/S0370269320301234)

    Abstract: In e+e− collisions, the integrated luminosity is generally measured from the rate of low-angle Bhabha interactions e+e−→e+e−. In the published LEP results, the inferred theoretical uncertainty of ±0.061% on the predicted rate is significantly larger than the reported experimental uncertainties. We present an updated and more accurate prediction of the Bhabha cross section in this letter, which is found to reduce the Bhabha cross section by about 0.048%, and its uncertainty to ±0.037%. When accounted for, these changes modify the number of light neutrino species (and its accuracy), as determined from the LEP measurement of the hadronic cross section at the Z peak, to Nν=2.9963±0.0074. The 20-years-old 2σ tension with the Standard Model is gone. 

  • A discussion of the recent muon result is on Peter Woit's blog.  Some of the comments under that blog post are of interest.
Presumably the large-angle Bhabha scattering used to calibrate the newer experiments is already much more accurate from the get-go.
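For context, the neutrino counting in the abstract follows the standard LEP bookkeeping (textbook material, not taken from the paper): the invisible width of the Z is attributed to light neutrino species, so

```latex
N_\nu \;=\; \frac{\Gamma_{\mathrm{inv}}}{\Gamma_{\ell\ell}}
\left( \frac{\Gamma_{\ell\ell}}{\Gamma_{\nu\bar{\nu}}} \right)_{\mathrm{SM}},
\qquad
\Gamma_{\mathrm{inv}} \;=\; \Gamma_Z - \Gamma_{\mathrm{had}} - 3\,\Gamma_{\ell\ell}.
```

This is why a shift in the hadronic cross section - which depends on the Bhabha-based luminosity normalization - propagates directly into the inferred value of Nν.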

        

Sunday, November 29, 2020

Machine Learning and Physics

Machine learning has been in use at the Large Hadron Collider for long enough that there is now a Coursera online course about it. Basically, machine learning is used to help handle the approximately one petabyte per second of data collected from particle collisions. That is one kind of use of machine learning.

 Neural networks recognize the content of images and video, and perform speech recognition, much better than the traditional kind of computer algorithm that people can write, so they are absolutely the right technique for giving computers the senses of vision and hearing. The detection of a "cat" or a "utility pole" in two-dimensional arrays of bytes, which is computer vision, generalizes to "seeing" patterns in N-dimensional arrays of data. This "data pattern sense" is a sense organ humans lack. Neural networks can help provide humans this sixth sense.
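A toy version of this "data pattern sense" - deliberately much simpler than the deep networks actually used at the LHC - is a nearest-centroid classifier that finds structure in N-dimensional points. The "signal"/"background" clusters below are synthetic and purely illustrative:

```python
# Nearest-centroid classification of N-dimensional points.
# (Illustrative only; real LHC pipelines use deep neural networks.)
def centroid(points):
    """Component-wise mean of a list of equal-length tuples."""
    dim = len(points[0])
    return [sum(p[i] for p in points) / len(points) for i in range(dim)]

def classify(x, centroids):
    """Assign x the label of the nearest centroid (squared Euclidean)."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: dist2(x, centroids[label]))

# Two synthetic "event" clusters in 3 dimensions.
signal = [(1.0, 1.1, 0.9), (0.9, 1.0, 1.2), (1.1, 0.8, 1.0)]
background = [(-1.0, -0.9, -1.1), (-1.2, -1.0, -0.8), (-0.9, -1.1, -1.0)]
centroids = {"signal": centroid(signal), "background": centroid(background)}

print(classify((0.8, 1.0, 1.0), centroids))  # → signal
```

Nothing here depends on the points living in 2 or 3 dimensions; the same code works unchanged in any N, which is the sense in which "vision" generalizes.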

 Beyond that, neural networks have to connect up with some type of symbolic representation in order to handle concepts, even simple relations like "bigger than", "smaller than", "behind", "above", etc. I learned about this back in February from the seventh lecture, "Neurosymbolic AI," in the MIT introduction to deep learning, 6.S191, and a regret this year is that I have not been able to follow up on it. The idea is something like this (words added to clips from David Cox's slides):


I believe that the computer will have to connect what it can sense with its "data pattern sense" to symbolic representations, and then, what the computer can do will be no better or worse in its performance than whatever automated reasoning can do today, for instance, in mathematics theorem provers. So, if mathematics falls to Artificial Intelligence, then physics may follow, but not otherwise.

Tuesday, April 07, 2020

On Emergent Phenomena

Bee has a blog post: "What is emergence? What does “emergent” mean?"  For whatever reason, 99% of my comments simply don't go through, so discussion over there seems impossible.  My thoughts on the subject are newly formed, and I'm putting them down here, so that I can get unstuck, and not because these ideas are right or have any merit.  So this post will likely be revised or even deleted.

This from Bee is as good a description of "emergent" as you can find:
Something is emergent if it comes about from the collective behavior of many constituents of a system, be that people or atoms. If something is emergent, it does not even make sense to speak about it for individual elements of the system.
There are a lot of quantities in physics which are emergent. Think for example of conductivity. Conductivity is the ability of a system to transport currents from one end to another. It’s a property of materials. But it does not make sense to speak of the conductivity of a single electron. It’s the same for viscosity, elasticity, even something as seemingly simple as the color of a material. Color is not a property you find if you take apart a painting into elementary particles. It comes from the band structure of molecules. It’s an emergent property.
It is in the discussion of weak and strong emergence that I drift away.  I think I get stuck on the "can be/cannot be derived".
Weak emergence means that the emergent property can be derived from the properties of the system’s constituents and the interactions between the constituents.....In physics the only type of emergence we have is weak emergence. With strong emergence philosophers refer to the hypothetical possibility that a system with many constituents displays a novel behavior which cannot be derived from the properties and the interactions of the constituents. While this is logically possible, there is not a single known example for this in the real world.
(Perhaps it is because I'm stuck on the notion of derivation as is done in mathematical logic.)

Consider the Second Law of Thermodynamics.  Pick any set of microscopic laws: the Standard Model of particle physics, the Standard Model modified in any way, or one of the 10^500 universes of superstring theory.  Or make heat a fluid (Lavoisier's "caloric" was the context in which Carnot did his work).  The Second Law remains true in all these cases.   While as students of physics we are indoctrinated with statistical mechanics as underlying the Second Law of Thermodynamics, the Law actually arises from very general considerations, with no assumptions at all about the microscopic physics, i.e., from the mathematical properties of Pfaffians and some mapping of mathematical concepts into physical concepts.  (Zemansky's 1965 "Kelvin and Caratheodory--A Reconciliation" indicates what I'm talking about).
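A minimal sketch of the Pfaffian route (standard textbook material, along the lines Zemansky describes, not a derivation from any microscopic model): for a simple system the heat one-form is

```latex
\delta Q \;=\; dU + p\,dV ,
```

and Caratheodory's principle - that every neighborhood of a state contains states unreachable along adiabatic paths (\delta Q = 0) - forces this one-form to admit an integrating factor,

```latex
\delta Q \;=\; T\,dS ,
```

so that an entropy S and an absolute temperature T exist with no reference whatsoever to the microscopic constitution of the system.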

The way I see it, the Second Law is true if heat is an invisible fluid, and it is also true if matter is made of atoms and heat is simply the energy of random molecular motion.  So in what sense can the Second Law of Thermodynamics be said to be derived from the properties of the system's constituents?   The Second Law is not merely an emergent law; in some sense of the term, it is "strongly emergent". It is a law that is true no matter what the underlying microscopic physics is, and thus can be "derived" from some axiom set including axiom A with some mapping of mathematical to physical concepts, but can equally well be derived from some other axiom set including the axiom (not A) with some other mapping of mathematical to physical concepts.

In this regard, I'm not sure the concepts of "weak emergence" and "strong emergence" are particularly useful.   An example is the never-ending debate over whether the phenomenon of "consciousness" is strongly emergent, or is explainable ultimately in terms of the brain and its cells.  Let's imagine humans, electronic/computing devices, and Fred Hoyle's solar-system-sized cloud all exhibit "consciousness".  Taking the Second Law of Thermodynamics as our exemplar, and assuming that a mathematical description of "consciousness" is feasible, one has to concede the possibility that such a description is rather independent of the microscopic details.   If that turns out to be true, then the same problematic situation (at least to me) arises - how can such a description be said to be "derived" from the properties and interactions of the constituents of the conscious system?

The chemistry of hydrogen, carbon, etc., is a particular way because of the properties of their constituents, and would be different if, e.g., the electron/proton mass ratio were different, or if the radius of the proton, measured in electron Compton wavelengths, were much larger.  Derivation of the chemistry crucially depends on these properties.   That is one kind of emergence.   I'd place all the things described by the Wilsonian renormalization group in this category too.

A second kind of emergence is where the behavior of the system can be described independent of its constituents, e.g., as with the Second Law of Thermodynamics.    These are perhaps two useful types of "emergence", especially if we can find more laws of physics like the Second Law.






Wednesday, February 13, 2019

Wincing at the state of physics

High energy particle physics theorists - the people who lead the investigations into the most fundamental aspects of nature - have been having a bad streak.  They have had no major successful new idea in the last forty years; all the theories they've proposed -- and there are lots of them -- have failed to yield an experimental signature.  The excuses for failure are flying thick.

Mathematician Peter Woit of Columbia University, on his blog, recently quoted physicist Nima Arkani-Hamed of the Institute for Advanced Study, Princeton, from an answer given in the Q&A session after a seminar, thus:
You could very justifiably say “look, you’re just continuing to make excuses for a paradigm that failed”, OK, and I would say that’s true, and even the paradigm most of your advisors love [e.g. usual SUSY] was already an excuse for the failure of non-supersymmetric GUTs before that.
That is a perfectly decent attitude to take, but I would like to at least tell you that you should study some of the history of physics. This very, very, very rarely happens, that some idea that seems basically right is just crap and wrong. It’s probably mostly right with a tweak or some reinterpretation. You’d have to go back over…, I don’t know how far you’d have to go back, even Ptolemy wasn’t so far from wrong.
Ouch! That is some seriously bad reading of the history of physics.   That was amply pointed out in the comments on the blog.

Rutgers University physicist  Amitabh Lath rightly, I think, pointed out:
Stop picking on Nima. You all are doing the internet thing of taking one statement in an hour talk and ganging up.
But challenged on it, he continued:
Even the pre-Copernican Ptolemaic stuff made sense to me. Basically, there are concepts in a failed theory that you might want to keep (things moving in circles around other things) and others you might want to jettison. Granted, he is not very good at history of science.
Double ouch!  Is "things moving in circles around other things" a valuable idea from the Ptolemaic theory of the solar system?  I think it is wrong on two counts: firstly, the idea of things moving in circles around other things is present in earlier theories of the heavens; and secondly, things moving in circles around other things is not a theoretical idea; it is a root observation, the experience early humans had of the skies at the very start of the study of the heavens.   For example, the sun rises at the east horizon in the morning every day, sets at the west horizon in the evening, and presumably somehow finds its way back in the dark to the east horizon by the next morning, completing a closed loop, if not a circle, in the eyes of early earth-bound humans.

I'm happy that I'm not in the high energy physics milieu at all.  Lot of mathematics, very little understanding.

Sunday, August 28, 2016

When will SUSY be wrong?

High energy particle physics theorists are disappointed and even dismayed that the Large Hadron Collider has shown up nothing beyond the Standard Model Higgs.  Their favorite "Beyond Standard Model" physics, based on an idea called supersymmetry (SUSY), has not shown even a tiny hint of existing.  Over on "Not Even Wrong", Peter Woit asked, "Is there any foreseeable experimental data that would cause you to decide that SUSY was an idea that should be abandoned?"

Urs Schreiber gave a logical answer as to why physicists might think SUSY is relevant to physics - not just in a technical sense where it can make some computations tractable - but as a part of reality.  You can follow the link (or see below the fold).

So, I thought that Woit's question was answered.  Nothing but a mathematical theorem with a proof will serve to end the SUSY quest.  Namely, something like:
"Nature chose to have an ordinary group act on the supermanifold" because:

1. Using a supergroup on the supermanifold implies a necessary feature in the low energy theory that our observed low energy world lacks;  or

2. Our low energy world has an observed feature that the use of a supergroup on the supermanifold cannot reproduce; or else,

3. Using a supergroup on a supermanifold produces a high-energy theory that fails for some reason (without even considering the low energy world).
Since the question is "Is there any foreseeable experimental data that would cause you to decide that SUSY was an idea that should be abandoned?", (3.) above need not concern us here.  The Higgs as detected by the LHC, with nothing super- accompanying it, does not quite fall into either 1. or 2. without additional assumptions.

Therefore the SUSY search will continue.
Woit didn't like that answer and deleted it.

On a side note, Charles Darwin, around the time of spelling out his theory of evolution, also wrote: "It is mere rubbish thinking at present of the origin of life-- one might as well think of the origin of matter."  This is because he knew that the problem of the origin of life was well beyond the reach of the science of his time.  Particle physics theorists, however, have believed for about forty years that a complete description of fundamental physics is within their grasp.  Nature has proven to be rather elusive.

Wednesday, May 04, 2016

mathpages.com

Stumbled across this yesterday - http://mathpages.com/ - looks like a lot of interesting reading - mostly mathematics, and some physics.  I came across this via a trail that began at a classic - The Art of Unix Programming.

Thursday, March 10, 2016

Particle Physics to-do list from 2005

Chris Quigg:  (the stuff in italics has answers today)


Thursday, February 18, 2016

The Unreasonable Effectiveness of Physical Insight

Physicist Lee Smolin has an interesting pre-print in the History and Philosophy of Physics section of arxiv.org : "Lessons from Einstein's 1915 discovery of general relativity".  ( If one has read books at the level of Penrose's popular works, this paper should be quite comprehensible.)

Smolin argues, that contrary to the myth created by Einstein himself, it was not beautiful mathematics that led Einstein to his theory of general relativity.  Historians of physics have gone through Einstein's notebooks, and what they shared with Smolin led him to this:



I had a very happy day about fifteen years ago when I visited Jurgen Renn in Berlin and he showed me images of the notebooks in which Einstein had created general relativity. What impressed me was that Einstein was using the same techniques all physicists use to grasp the essential features of a phenomena they want to model. These are the development of approximate expressions, together with the playful creation of simple examples and models. These are the tools every physicist is taught, which they employ throughout their career, first, to do their homework and, later, to make progress in their research.

The mathematics Einstein used may appear beautiful to some who study it, but what is going on in Einstein’s notebooks was not beautiful. It was hardheaded and pragmatic.  When you dine at a fancy restaurant you may be impressed by the aesthetic presentation of a dish as it is brought to the table. But this is only the last step, just as the freshness of the ingredients as they come from the farm is only the first step. In between, hidden in the kitchen, it is all just hard, practical work. Mistakes are made, but these, ideally, never leave the kitchen. In Einstein’s kitchen—his notebooks—it was no different.

It was Einstein's physical insight and intuition that gave him his greatest successes, and when he had no new insights and relied on the beauty of mathematics, as in his futile quest for a unified theory, he got absolutely nowhere. 

Smolin draws the lesson:
The lesson is that the task of formulating a physical principle must come first—only when we have one in hand do we have a basis to look for new mathematics to express the new principle.

The physical principles "that Einstein invented such as the principle of equivalence and the principle of the relativity of inertial frames"..."are directly about nature."

They constrain, and can be falsified by, individual experiments. They require no mathematics to express them: their contents can be entirely captured in a verbal description of an experiment. Historians talk of “thought experiments”, but in fact the principles invented by the young Einstein referred to genuinely doable experiments.

All very good, but then Smolin goes on to propose that background independence ("...the laws of nature should be statable in a form that does not rely on the specification of a fixed geometry of spacetime") might be one such physical principle. Here, I'm lost. This seems to me to be a statement the real content of which can only be expressed in mathematics. I'm hard-pressed to think of genuinely doable experiments that test this principle.

Smolin weighs in on the holography principle: "This says that a model world with gravity can be described as if it were a world without gravity, with one fewer dimension, where that surface theory has one degree of freedom per Planck area", and says that this principle does not have the physical content of the principles of relativity and equivalence, and cannot be tested in single experiment.

So color me puzzled. But it suits my particular inclination that physics proceeds with physical insight, not with mathematics. Of course, if we're stuck with no unexplained anomalous observations or experimental results, and we have no good physical principle, then we can only pay attention to the mathematics, and hope that it leads to something. The history of the last forty years of particle physics is that this is a slender straw to cling to.

Tuesday, June 02, 2015

Physics - A New Theory to Explain the Higgs Mass

Via a comment by David Metzler on Peter Woit's blog - this article describes a new proposal that explains the mass of the Higgs particle.

Here is the arxiv.org pre-print: http://arxiv.org/abs/1504.07551.
Cosmological Relaxation of the Electroweak Scale
A new class of solutions to the electroweak hierarchy problem is presented that does not require either weak scale dynamics or anthropics. Dynamical evolution during the early universe drives the Higgs mass to a value much smaller than the cutoff. The simplest model has the particle content of the standard model plus a QCD axion and an inflation sector. The highest cutoff achieved in any technically natural model is 10^8 GeV. 
In all the years since I crashed out of theoretical particle physics, I have not come across any work that I wish I had done. Of course, that may be due to my ignorance.  This, to me, is a strong candidate for such a work.  That too, may be due to my ignorance.   Ignorance is bliss, isn't it?

So I should explain why I think this paper is important.  I think the article at the first link explains to a non-physicist the problem that this paper solves about as well as is possible (until someone like Sabine Hossenfelder decides to write about it, something which we should all devoutly hope for.)

Let's just say that the Standard Model of particle physics has a problem, and the orthodoxy for the past many decades has been to try to solve it by tacking on additional particles and even things such as additional dimensions of space.   This paper solves the problem - "provides an ansatz" may be more accurate - without adding any such things. Its particular models may ultimately not be viable, but it has broken the logjam; it is a demonstration that the huge zoo of postulated additional particles and such constructs of the theorists is not necessary to solve the problem, and thus is a good corrective to the last 30-40 years of mainstream particle physics theory.

As far as I can tell, there is nothing in this paper's content that could not have been figured out twenty years ago.  Perhaps it is the salutary shock of finding absolutely no trace at the cutting edge of experiments at the Large Hadron Collider (LHC) of the theorists' burgeoning menagerie that enabled the mental break with the orthodoxy.  And thus it should be, physics is an experimental science; it is most certainly not mathematics.

Thursday, March 22, 2012

The Perils of the Smartness Obsession


Paul Frampton, distinguished professor of physics, age 68, was in the news. He was arrested in Buenos Aires in January on the charges of attempting to smuggle two kilograms of cocaine out of the country.

Professor Frampton was likely the victim of a scam. Like the New Zealander Sharon Mae Armstrong, he was lured to Argentina by the other end of an internet romance.  He never met the "model" but was persuaded by an associate to carry a bag on her behalf.  The bag had cocaine in a hidden compartment.
Sharon Armstrong had five kilograms of cocaine in her luggage.

As a physicist's blog put it,
Everyone passing through international airports will know that they must pack their own bags and be responsible for the contents. Travellers are continually warned and asked about it. It is easy to be befriended especially in honeypot traps. The details of how Frampton may have been tricked are not yet known but similar stories are well-known. Cases have even been turned into films such as Bangkok Hilton. It will be hard for an intelligent professor to persuade his prosecutors that he was naive enough to innocently accept to use a suitcase with cocaine stuffed into the padding. We wish him luck.
So far naivete and perhaps stupidity have been on display - why my headline? Well, someone drew attention to a section in this preprint by Frampton which I have reproduced after the fold.

Tuesday, March 22, 2011

2011 Isaac Asimov Memorial Debate: The Theory of Everything

Monday, September 13, 2010

Interpretation of Quantum Mechanics

I spent some time this weekend, listening to about five of the Perimeter Institute's recorded lectures on the Foundations and Interpretation of Quantum Theory, available here. It was heavy going and I'm not sure I've learned a lot.

Let me just mention the problem. Quantum mechanics has an impeccable mathematical formalism, and a simple mathematical rule to relate what you calculate using the machinery with the outcomes of measurements. If you're satisfied with this, then you belong to the "Shut Up and Calculate" school of physicists. What the formalism has trouble with is relating to our ordinary intuitions of the world. It is difficult even to explain how the illusion of what we perceive arises.

At this point, I refer you to Wiki.

Monday, July 26, 2010

Physics Olympiad

Indian girl (but headed to MIT) wins a gold.

Friday, February 12, 2010

QOTD

From a comment on Peter Woit's blog, by a physics professor at a small department, about undergraduate students:
And they seem completely uninterested in such topics [General Relativity, Quantum Field Theory]; they seem to view gauged field theories not as the theoretical triumph that they are, nor as a rich field of ongoing investigation, but as a pesky inconvenience on their way to becoming the next Einstein. There’s no real intellectual curiosity; they just want to be the next Einstein.

Thursday, January 14, 2010

QOTD

The (futile?) quest for new physics:
Unfortunately, the measured rates are all in excellent agreement with standard model predictions.