The Amelioration of Uncertainty

The most interesting thing you’ll hear about Fisher today

Nothing brings out the silliness in smart people like Quantum Mechanics, a subject I always associate with … R. A. Fisher. I confess to liking Fisher more than Bayesians should. Unlike the forgettable p-value conjurers I’ve known in person, Fisher comes across in his writing as a thinking, creative scientist, and feels oddly familiar. So what does Fisher have to do with Quantum Mechanics?

If you take the Lagrangian for Schrödinger’s Equation and replace the wave function with $\psi = \sqrt{\rho}\,e^{iS/\hbar}$, where $\rho$ is the probability distribution, you get:

$$\mathcal{L} = \frac{\hbar^2}{8m}\int \frac{(\nabla\rho)^2}{\rho}\,dx \;+\; \int \rho\left[\frac{\partial S}{\partial t} + \frac{(\nabla S)^2}{2m} + V\right]dx$$

where $S$ is the phase. Schrödinger’s Equation results from finding $\mathcal{L}$’s extremum with respect to both $\rho$ and $S$ using the Euler–Lagrange equations. The first term is, up to the factor $\hbar^2/8m$, Fisher’s Information for a location parameter,

$$I(\rho) = \int \frac{(\nabla\rho)^2}{\rho}\,dx,$$

while the quantity being averaged over in the second term is the left-hand side of the classical Hamilton–Jacobi equation:

$$\frac{\partial S}{\partial t} + \frac{(\nabla S)^2}{2m} + V = 0.$$

This makes the whole thing reminiscent of the constrained maxent procedure from Statistical Mechanics: it’s almost as though Fisher’s Information were being extremized subject to a classical constraint. Note that Fisher’s Information is a quadratic approximation to entropy, and similar “information” terms appear in the Lagrangians for the Pauli, Klein–Gordon, and Dirac equations as well.
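For reference, here is the Madelung-style bookkeeping behind that claim (my sketch, not from the original post). Varying the Lagrangian with respect to $S$ and with respect to $\rho$ separately gives:

```latex
% Vary with respect to S: probability is transported by the velocity field \nabla S / m
\frac{\partial \rho}{\partial t}
  + \nabla \cdot \left( \rho \, \frac{\nabla S}{m} \right) = 0

% Vary with respect to \rho: Hamilton-Jacobi plus a term coming from the Fisher piece
\frac{\partial S}{\partial t} + \frac{(\nabla S)^2}{2m} + V
  = \frac{\hbar^2}{2m} \, \frac{\nabla^2 \sqrt{\rho}}{\sqrt{\rho}}
```

The right-hand side of the second equation (the negative of Bohm’s “quantum potential”) is the only place $\hbar$ survives; set it to zero and you are back to the classical Hamilton–Jacobi equation with probability passively carried along the classical flow.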

Since Fisher’s Information is related to the Cramer-Rao bound, its not surprising it’s mathematically related to the Uncertainty Principle. In fact, Fishers’ Information can be used to derive an improved uncertainty relation,


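The Cramér–Rao side of this is easy to check numerically (a sketch; the grid and the example distributions are my own choices): a Gaussian saturates the bound $\sigma_x^2 I = 1$, while a bimodal distribution has $\sigma_x^2 I \gg 1$, which is exactly where a Fisher-information bound on $\sigma_p^2$ improves on the plain Heisenberg lower bound.

```python
import numpy as np

def fisher_information(rho, dx):
    # I = integral of (rho')^2 / rho, the location-parameter Fisher
    # information, discretized with central differences on a grid
    drho = np.gradient(rho, dx)
    return np.sum(drho**2 / rho) * dx

x = np.linspace(-20, 20, 40001)
dx = x[1] - x[0]

# Gaussian: Cramér-Rao is saturated, so sigma_x^2 * I = 1
sigma = 1.5
rho = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
print(sigma**2 * fisher_information(rho, dx))   # ≈ 1.0

# Bimodal mixture of unit Gaussians at +/-3: sigma_x^2 * I is well above 1,
# so (hbar^2/4) * I beats hbar^2 / (4 sigma_x^2)
rho2 = 0.5 * (np.exp(-(x - 3)**2 / 2) + np.exp(-(x + 3)**2 / 2)) / np.sqrt(2 * np.pi)
mean2 = np.sum(x * rho2) * dx
var2 = np.sum((x - mean2)**2 * rho2) * dx
print(var2 * fisher_information(rho2, dx))      # ≈ 10, well above 1
```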
All of this is incredibly suggestive, but unfortunately it’s gone nowhere. Both Fisher’s Information and Schrödinger’s Equation originated around the same time (the 1920s), and many must have seen the connection, but no one’s been able to make anything out of it.

There is this work by B. Roy Frieden, which claims Fisher’s Information explains everything, but unfortunately this seems to be the work of a crank. I’m not sure about Frieden’s background, but he seems to have been a well-respected Physicist who became familiar with Fisher’s Information through applied work, possibly in image reconstruction, and noticed its presence in Quantum Mechanics. I don’t think he was a fully fledged Bayesian, but he seems to have run in Bayesian circles.

His attempt is obviously well meant, but I doubt this would have been published if he hadn’t already been an established scientist.

Anyway, Fisher’s Information may have nothing to do with the foundations of Quantum Mechanics, but it’s an idea worth keeping in mind.

September 25, 2013
  • September 25, 2013 · Daniel Lakeland

    You might like Edward Nelson’s work on “stochastic mechanics” and the foundations of QM. It takes these connections a little further. I just bought a festschrift for Edward Nelson: ISBN 0691125457 in which people take more or less a similar approach to what you’re talking about here.

    QM clearly involves a kind of diffusion, but it’s a diffusion with strong correlations over long distances. There’s been work on this all surrounding Ed Nelson, but it’s relatively obscure. Take a look if you have a little time.

    PS: Edward Nelson is also the originator of the IST version of nonstandard analysis that I’m very fond of, and wrote a great book on Brownian motion, available on his website:

  • September 26, 2013 · Brendon J. Brewer

    From your link to Gelman:

    “If you recall your college physics, you’ll realize that the results of the two-slit experiment violate the laws of joint probability”

    :-O Really?

    Ariel Caticha (SUNY) has done a lot of work on Fisher information and quantum mechanics from a Jaynesian point of view. I don’t understand it that well though so I can’t vouch for it.

  • September 26, 2013 · Corey

    Caticha’s approach seems to go like this: first, unify MaxEnt and Bayesian inference as minimization of the relative entropy of the posterior w.r.t. the prior. Second, conjecture that this minimization is not just analogous but identical to the process of making physical predictions based on minimizing functionals of state space configurations. Third, fall in love with the idea and base a research program on it. By design, this perspective replicates the success of standard approaches — which is as it should be. The hope (my hope, anyway) is that the fresh perspective proves its value by leading to powerful new approaches that provide results in unsolved problems. Caticha’s work doesn’t seem to me to display an abundance of such results (albeit I’m not really in any position to judge).

  • September 27, 2013 · Joseph


    Thanks for the Stochastic Mechanics references. It’s a very interesting read. I was up to date at some point on Stochastic Electrodynamics, which was similarly being used as a possible foundation for QM.

    I really need to write this up in a number of posts, but my take is basically this: these efforts are being severely hampered by Stochastic calculus and the related ideas.

    Basically, stochastic calculus is the unholy amalgam of probability distributions on paths and frequentist intuition. The result is a cramped, difficult, but more importantly, highly limited version of the subject. If you take a Bayesian view of probability distributions on paths you get something simpler, more general, and considerably more flexible. See, for example, the post “the law of trading edges” or even the comments from the last post.

    So while Stochastic Electrodynamics is a failure as a foundation for QM, I’m optimistic about the approach. They were able to get some good results using stochastic models/stochastic differential equations, which is about like trying to do Quantum Mechanics using the “Greek geometry” style of Calculus that Newton used.

    In short: I think there is a good chance it’s the mathematics which has failed in these attempts, not the physical ideas. And to improve the math all you need to do is interpret the probability distributions on paths in a truly Jaynesian way.

  • September 27, 2013 · Joseph


    Notice that mine was the third comment, so I’d like to take credit for predicting the silliness that can be found in the comments to Gelman’s post.

    If you always confuse probabilities and frequencies, always interpret inferences P(x|H) as “H causes x”, and don’t distinguish between P(x|H1) and P(x|H2), you can tie yourself in knots so big you’ll never get out.

    One look at that discussion explains why physics hasn’t done anything in half a century or more. Physicists used to deride the other sciences as equivalent to “stamp collecting”. Yet during my entire lifetime the greatest advance in physics involved adding to their elementary particle stamp collection.

    This is not unique to physicists. Finance/Economist types never pass up an opportunity to get confused about probability distributions either.

  • September 27, 2013 · Joseph


    I’ve been looking at Caticha ever since Brendon first mentioned him and had exactly the same reaction. There’s a dead simple way to tell if there’s been successful Jaynesians style interpretation of QM: it blows the subject wide open and creates a host of new results/applications/experiments. Caticha is interesting though and I’ve been reading whatever I can get my hands on.

    In general, I lump this in with a wide class of attempts to operate with “information” as a primitive notion. Some of Arnold Zellner’s work in econometrics strikes me as being in the same vein, although it’s superficially completely different. I have no problem with this type of thing in principle, and 50-70 years ago it would have seemed like the way to go, but there’s been a lot of water under the bridge since then. In retrospect, attempts to treat “information” as an intuitive primitive used to derive principles, equations, and methods have more or less failed.

    You’ll notice that Jaynes himself was well aware of these efforts but didn’t really participate in them. I think there’s a good reason for that. He already understood what was happening in a simpler, more down to earth way.

    And really the whole thing is super simple. If you know $P(x)$, and the high-probability region $W$ of $P$ has been made as small as you can make it, then $\log|W|$ is a measure of your ignorance about $x$. In practice this requires a slight generalization:

    $$S = -\int P(x) \ln \frac{P(x)}{m(x)}\, dx$$

    which is needed because sometimes we want to count over spaces other than the one $P(x)$ explicitly mentions. But that’s it. This idea is already so simple that attempts to separate “information” out into a primitive concept just confuse things.
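The “measure of your ignorance” reading has a tidy numerical face: for a uniform distribution over a region $W$, the Shannon entropy is exactly $\log|W|$, and more generally $e^{H}$ acts like the effective size of the region the distribution occupies. A small sketch (the particular distributions are invented for illustration):

```python
import numpy as np

def entropy(p):
    # Shannon entropy -sum p log p, skipping zero-probability states
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Uniform over a region W of 100 states out of 10^6: the ignorance is
# exactly log |W|, no matter how large the full space is.
p = np.zeros(10**6)
p[:100] = 1 / 100
print(entropy(p), np.log(100))   # equal

# A peaked distribution on 4 states: exp(H) acts as the "effective"
# number of states the distribution is really spread over.
q = np.array([0.5, 0.25, 0.125, 0.125])
print(np.exp(entropy(q)))        # between 1 and 4
```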

  • September 27, 2013 · Daniel Lakeland

    I’m with you on Bayesian probabilities over paths. Towards the end of Gelman’s post there’s some discussion of DeBroglie/Bohm QM. I posted a link to some recent work on visualizing “pilot waves” using a hydrodynamic analog. I think Nelson’s IST makes it easy enough to reason about distributions over paths that a mere mortal can take the Bayesian approach and try to run with it. In particular, with the Bohm picture, the path uncertainty is induced by uncertainty in the initial conditions only, everything else is deterministic. I have heard that the Bohm picture is problematic for relativity, but just don’t know enough about it.

  • September 27, 2013 · Corey

    “Notice that mine was the third comment, so I’d like to take credit for predicting the silliness that can be found in the comments to Gelman’s post.”

    Well predicted, sir! You win a free subscription to my blog. ;-)

  • September 27, 2013 · Daniel Lakeland

    In particular, Nelson’s “radically elementary probability theory” shows that you can get all the results of stochastic calculus using discretizations on a finite but unlimited (nonstandard) grid in time and space. Doing the same thing lets you put a rigorous definition over Feynman path integrals as well. There’s a recent book along these lines that I also just bought: Frederik Herzberg, “Stochastic Calculus with Infinitesimals”.

    I think the distinction needs to be made between probability theory and statistics. It’s fine to talk about probability theory, the pure mathematics of it, in terms of sequences of random numbers, distributions as urns, etc. When it comes to applying that calculus to physical modeling, we need to distinguish between an interpretation in terms of Bayesian uncertainty and an interpretation in terms of the frequencies with which physical events occur. I particularly like the idea of incorporating Bayesian uncertainty in the initial conditions with deterministic Bohmian quantum mechanics. It just has the right feel, but I confess that I don’t know the implications under relativity. Nevertheless, the point that Bell makes, that we can’t just say we “measured” the system (since the measurement apparatus *is* a part of the system), seems to me to be a critically missing component of what I know about the “usual” interpretation of QM.
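For what it’s worth, the free Gaussian packet is one case where “deterministic Bohmian flow plus Bayesian uncertainty in $x_0$ only” can be checked in a few lines. The trajectory formula below is the standard Bohmian result for that packet (units with $\hbar = m = 1$); the rest is my sketch:

```python
import numpy as np

# Free-particle Gaussian packet with initial width s0:
# |psi(x,t)|^2 stays Gaussian with width s(t) = s0*sqrt(1 + (t/(2 s0^2))^2),
# and the Bohmian trajectories for this packet are x(t) = x0 * s(t)/s0 --
# fully deterministic once the initial position x0 is fixed.
s0 = 1.0

def s(t):
    return s0 * np.sqrt(1 + (t / (2 * s0**2))**2)

# All the uncertainty lives in the initial condition: x0 ~ |psi(x,0)|^2
rng = np.random.default_rng(0)
x0 = rng.normal(0.0, s0, size=200_000)

t = 3.0
xt = x0 * s(t) / s0          # push each initial condition through the flow
print(xt.std(), s(t))        # ensemble spread tracks the |psi(x,t)|^2 width
```

So if the initial positions are distributed as $|\psi(x,0)|^2$, the deterministic flow keeps them distributed as $|\psi(x,t)|^2$ for all later times, which is the consistency property the Bohm picture needs.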

  • September 27, 2013 · Joseph

    Stochastic Differential Equations and the Stochastic Calculus can really dazzle people. I remember loving it when I first encountered it in Finance, and loving how you can appropriate all that great Feynman path integral machinery and convert it into classical solutions for use in finance or other settings. See here:

    But it really is pretty crappy, and I suspect it will be viewed by later generations as an unfortunate mathematical cul-de-sac which held up progress in just about every field where it was used heavily.

    Basically, whenever someone wants to do something like “Stochastic Mechanics” they reach for SDEs because it’s a highly developed mathematical tool. In reality, using it for Stochastic Mechanics is about like trying to do Quantum Mechanics using nothing more than Greek-style geometry demonstrations.

    SDEs are extremely limited in this way because of the need of most people, including most Bayesians, to give a frequency interpretation to their probability mathematics. In essence, you’re only allowed to work with a tiny subclass of distributions on paths which somehow make sense from a frequency perspective, and you’re only allowed to manipulate them in ways that make sense from a frequency perspective.

    But if you have no such need, then you can possibly bring in more powerful tools. Look at that Stochastic Mechanics closely: does it look like it’s the mathematical tools which aren’t powerful enough or does it look like it’s the physical ideas which are failing? To me it looks like it’s the mathematical tools that are letting people down.

  • September 27, 2013 · Joseph

    Also, Daniel the Bohm pilot wave stuff is great. Even if it turns out to be bunk (which is my guess), it still served a very useful purpose and clarified quite a few things.

    Bohm had all the street cred from the Aharonov–Bohm effect and his (Copenhagen interpretation) textbook, which is a good thing since it’s unlikely anyone would have listened to his ideas otherwise.

  • September 27, 2013 · Daniel Lakeland

    I disagree a bit. I think the SDE stuff has some nice mathematical properties if you keep it simple enough, so people stick to those simple ideas, and this keeps them from getting close to real problems. If you add in the complications needed to model a real system, it just requires a ton of Monte Carlo, and this turns off mathematicians.

    From the Bayesian perspective, if you know that something is going on to perturb your system at many points in time, but you have no means to distinguish what is going on at different points in time, then exchangeability says you should model all those things as iid draws from *some* distribution. If you want continuity of the path, then you can prove that it *must* be a Gaussian distribution. There are, however, all kinds of ways in which exchangeability can be broken; you can incorporate your time-local knowledge in a time-varying SDE. This is almost never done, in part because people just like the elegance of their simpler mathematical toy.
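A time-varying SDE of the kind described above is no harder to simulate than a constant-coefficient one. A sketch with an invented $\theta$ and $\sigma(t)$, checked against the closed-form variance of the linear SDE:

```python
import numpy as np

# Euler-Maruyama for  dX = -theta*X dt + sigma(t) dW,  where the noise
# amplitude sigma(t) encodes time-local knowledge.  (The particular
# theta and sigma here are made up for illustration.)
theta = 1.0

def sigma(t):
    return 1.0 + 0.5 * np.sin(2 * np.pi * t)

rng = np.random.default_rng(42)
paths, n_steps, T = 50_000, 1000, 1.0
dt = T / n_steps
X = np.zeros(paths)
for k in range(n_steps):
    X += -theta * X * dt + sigma(k * dt) * np.sqrt(dt) * rng.normal(size=paths)

# The linear SDE has a closed-form answer to check against:
# Var X_T = integral_0^T exp(-2*theta*(T-s)) * sigma(s)^2 ds
s_grid = np.linspace(0.0, T, 20_001)
ds = s_grid[1] - s_grid[0]
var_exact = np.sum(np.exp(-2 * theta * (T - s_grid)) * sigma(s_grid)**2) * ds
print(X.var(), var_exact)   # agree to Monte Carlo accuracy
```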

    Also, I think many SDE people think of the SDE as fundamental, rather than as some kind of approximation. If you put a rubber duck out in the ocean and you want to know where it will be a month from now, they say something like “well, it has a continuous path, so it’s provably perturbed by white noise”…. ummm, no: it’s perturbed by the weather. You’re modeling the weather as random IID perturbations, and this model only maintains continuity of the path under Gaussian white noise.

    But, you can get a decent bayesian prediction by taking your bayesian weather model and using it as the predictor for forcing that moves the rubber duck around. That’s not precisely an SDE. But if you’re planning to watch for a month, and you’re taking snapshots every hour, the net effect of an hour of weather may be not very different from the effect of the appropriately correlated gaussian process … and it may be easy to compute an hour of SDE perturbations, whereas it really sucks to compute an ensemble of a few hundred global weather models.

    it’s a modeling decision and when properly viewed as such it can be useful in some circumstances.

  • September 27, 2013 · Daniel Lakeland

    I should have said, “the net effect of an hour of weather may be not very different from the effect of the appropriately correlated gaussian process, and if the gaussian process is correlated on a short time scale compared to 1 hour, then it might as well be locally a white noise SDE”

  • September 27, 2013 · Daniel Lakeland

    I would say that the goal behind an SDE is to describe what we know about the path of a system using only local information and uncertainty about that local information. I think you’re right that there are plenty of times when that just isn’t enough, and we need to incorporate more information about the path: for example, that the path is twice differentiable and its second derivative is bounded by a definite constant, or something like that.

    For example, we have some particle falling through turbulent air. We don’t have any information about the air flow, except its effect on the particle so far (ie. we have a measurement of the path from t=0 to t=now). Perhaps the particle trajectory can be understood pretty well using 4 or 5 snapshots of the fall, plus a gaussian process with covariance function c(t1,t2) = exp(-((t1-t2)/s)^2/2).

    Now we make the turbulence more intense, and the mass of the particle smaller, and we can make s, the correlation timescale smaller…. Now we could be fans of IST and make s infinitesimal, and we’re going to get something like white noise. To the extent that the mass is small and the turbulence intense, the white noise approximation is going to work pretty well to tell us about the uncertainty in the path. Unfortunately, SDE guys sometimes seem to think this is the only case worth considering. I think this is in large part because they’re not working in IST or another nonstandard analysis, and so when they pass to the limit the white noise is kind of a mystical new object. It’s much less impressive if you take the nonstandard perspective.
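That limiting behavior is easy to see by sampling the stated covariance directly (the grid, timescales, and jitter below are my choices):

```python
import numpy as np

# The covariance c(t1,t2) = exp(-((t1-t2)/s)^2 / 2), sampled on a grid.
# As the correlation timescale s drops below the grid spacing, the sampled
# forcing becomes indistinguishable from white noise.
def gp_samples(ts, s, n_samp, rng):
    C = np.exp(-((ts[:, None] - ts[None, :]) / s)**2 / 2)
    L = np.linalg.cholesky(C + 1e-6 * np.eye(len(ts)))   # jitter for stability
    return (L @ rng.normal(size=(len(ts), n_samp))).T

ts = np.linspace(0, 1, 100)            # grid spacing about 0.01
rng = np.random.default_rng(7)

results = {}
for s in [0.3, 0.01, 0.001]:
    f = gp_samples(ts, s, 400, rng)
    # correlation between forcing values one grid step apart
    results[s] = np.mean(f[:, :-1] * f[:, 1:])
    print(s, round(results[s], 3))     # near 1 for s >> spacing, near 0 for s << spacing
```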

  • September 27, 2013 · Corey

    Daniel, I paged through Charlie Geyer’s book on non-standard probability theory. Most of it looks familiar, but there were a few things in there that left me scratching my head…

  • September 28, 2013 · Daniel Lakeland

    Corey: I’m emailing you, reply if you want to talk about the “head scratching” portions of NSA/IST and etc. I am working on a paper on continuum mechanics in an NSA/IST framework and it would be really useful to discuss with someone.

  • September 30, 2013 · Corey


    One of Caticha’s earlier papers was a derivation of QM by considering the combination of particle experiments in series or in “parallel”. IIRC, he gets the existence of a Hamiltonian, linearity, and unitarity out of consistency requirements, a normalization assumption, and plus a bare, unmotivated postulate that the domain (and hence range) of the combination functions is the complex numbers.

    Now I also seem to recall hearing, during the recent PR push for the so-called “amplituhedron”, that linearity and unitarity don’t play nice with gravity, and this is a major stumbling block in the way of reconciling QM and GR. It seems to me that Caticha’s approach ought to work in a gravity-affected setting — not with the complex numbers, but with some other set or field or something — in such a way that QM can be recovered in the no-gravity limit. But I don’t currently know enough about the physics to guess if this is a reasonable way to try to go about it…

    What do you think?

  • September 30, 2013 · Daniel Lakeland

    Hey Corey, can you point to the article you’re referring to on arXiv or elsewhere?

  • September 30, 2013 · Corey

    The style of this blog doesn’t highlight links for some reason. The words “earlier papers” in my first sentence hide the link.

  • October 1, 2013 · Joseph


    When it comes to QM, or gravity, I don’t know what to think. What I do know is what I’m willing to spend time on and what I’m not. That’s a highly personal decision though. Here are some points:

    -People have been trying to clarify the Quantum Muddle with these kinds of “derivations” for a long time. There must be thousands of these in the literature at this point, but the Quantum Muddle is still there.

    -Every one of these derivations I’ve seen has at least one strong, but in truth unmotivated, assumption which is doing the heavy lifting of recovering the equations of QM. This paper has at least one, as do attempts to see QM as an application of Fisher’s Information.

    -Having said that, I pay attention to these kinds of things when I run across them. If only for the partial insights they contain, interesting mathematics, or just raw creative ideas.

    -I don’t think a resolution of the Quantum Muddle will be found in the furthest branches of physics. That is to say, I don’t think we’ll get much out of a successful unification of QM and gravity or any other “frontier” topic like string theory. The rot in the tree of physics set in much closer to the roots.

    Or to put it another way, I think any resolution of the Quantum Muddle is likely to make the other branches of physics (classical mechanics, electrodynamics, statistical mechanics) appear very differently than they do now.

    -People forget how weird classical mechanics really is. We become accustomed to it, and a great deal about it is known mathematically, but that only superficially wipes away the weirdness. Take, for example, the S function in the classical Hamilton-Jacobi equation above. The classical action (or “actions”, since there are several similar functions in classical mechanics which are easily confused) really is an odd beast.

    Every time physicists think they’ve understood it completely, some mathematician comes along and proves them wrong. So maybe one approach to the Quantum Muddle would be to clarify what that S is and why it might be in cahoots with a probability distribution.

    -I think Jaynes was absolutely right about QM being a confused muddle of ontological and epistemological components. I wouldn’t want to predict what happens to the ontological component, but I’m personally convinced the epistemological one can be made a good deal clearer and saner than it is now.

  • October 4, 2013 · Corey


    I’m deeply interested in hearing more about the mysteries of the S function. It seems to be inside-baseball enough that I haven’t encountered them before in my pop-physics readings. What sorts of things does “Every time physicists think they’ve understood it completely, some mathematician comes along and proves them wrong” refer to?

    On recovering QM, of course there has to be at least one strong and “unmotivated” assumption. (Actually I’d consider it motivated by the fact that we want QM out the end.) My perspective is that the name of the game is to reduce the strength of that assumption so that axioms (in the original sense of the word, intuitively obvious truths) like consistency are relied on as much as possible. This is desirable so that contradictions with extensions of QM are minimized — we don’t want to have ruled out a correct extension through lack of imagination.

  • October 4, 2013 · Daniel Lakeland

    Corey, the Hamilton-Jacobi foundation for classical mechanics is inside-baseball enough that although I made analytical mechanics a particular area of study, and I’ve heard of it, I’ve never really followed up on it much.

    My interest has largely gone towards the Lagrangian formulation, and a little towards Gauss’ formulation. The main reason is that these formulations can deal in a relatively straightforward way with a separation between system and environment, i.e. applied forces or external non-holonomic work-applying constraints (like, say, robotic actuators, control circuits, etc.). Hamiltonian mechanics doesn’t work when you have someone external to the system monkeying with it ;-)

    A little about the S function as I’ve heard of it:

    Most of these formulations rely on a path-distinguishing function. As a sometimes computer scientist I don’t like the word “functional” (or “function of a function”), but the basic idea is that if you give it a path through configuration space, S will give back a number for each point in the path. The Hamilton-Jacobi equations are conditions that make the integral of this path-distinguishing function through time (i.e. the action) have some kind of special property, usually a minimum or stationary value. I think the H-J formulation’s benefits are largely that it has some straightforward interpretation in terms of curved geometry and plays nicely with relativity. It’s called a “generalization” of the formulation in terms of the calculus of variations. I’d actually love to get a really straightforward reference on it. The wikipedia article is terrible in my opinion.
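The description above ("S gives back a number for each path, and the classical path makes its integral stationary") can be made concrete with a discretized harmonic oscillator (my sketch; $m = \omega = 1$, and with $T < \pi$ the stationary point is an actual minimum):

```python
import numpy as np

# Discretized action of a harmonic oscillator on a time grid:
#   S[x] = sum_k dt * [ ((x_{k+1}-x_k)/dt)^2 / 2  -  x_k^2 / 2 ]
T, N = 1.0, 1000
dt = T / N

def action(x):
    v = np.diff(x) / dt
    return np.sum(0.5 * v**2 * dt) - np.sum(0.5 * x[:-1]**2 * dt)

# Solve the discrete Euler-Lagrange equations
#   (x_{k+1} - 2 x_k + x_{k-1})/dt^2 + x_k = 0,  x(0) = 1, x(T) = 0,
# which are exactly the stationarity conditions of the action above.
A = np.zeros((N - 1, N - 1))
idx = np.arange(N - 1)
A[idx, idx] = -2 / dt**2 + 1
A[idx[:-1], idx[:-1] + 1] = 1 / dt**2
A[idx[1:], idx[1:] - 1] = 1 / dt**2
b = np.zeros(N - 1)
b[0] = -1 / dt**2                      # moves the x(0) = 1 boundary term
x = np.concatenate([[1.0], np.linalg.solve(A, b), [0.0]])

# Every endpoint-fixed perturbation of the classical path raises S
rng = np.random.default_rng(0)
S0 = action(x)
for _ in range(20):
    v = rng.normal(size=N + 1) * 0.01
    v[0] = v[-1] = 0.0
    assert action(x + v) > S0
print("classical path makes the discrete action stationary; S =", S0)
```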

  • October 4, 2013 · Joseph

    Corey and Daniel,

    By far the best place to look is this book:

    It’s just a well written, cheap and accessible book. Probably everyone who’s seen the mathematics has been amazed by the beauty of it. It’s not really a difficult subject.

    What I meant was that this stuff has been around for a couple of hundred years now and on several occasions it was thought we’d wrung all the insight from this mathematics there was to be had. But each time it turned out there was plenty of good stuff left to be discovered. I doubt we’ve discovered all of it currently.

    “Actually I’d consider it motivated by the fact that we want QM out the end”

    Exactly, that’s a better way to say it. The assumption is plausible ONLY because it allows us to recover QM. By the way, many of the best physicists I’ve worked closely with refuse to look at these kinds of papers. Some were very hostile to them. One I knew well as an undergraduate, who went to Cal Tech and then MIT, seemed unshakably ambivalent about them. He died recently and I doubt anyone will ever read anything he wrote again.

  • October 4, 2013 · Daniel Lakeland

    Joseph: I love that book by Lanczos; it was my constant companion for the first year of my PhD… but I don’t remember much about Hamilton-Jacobi specifically. I will take another look.

    I have a few links on my blog about how some recent papers have de-mystified non-holonomic constraints in lagrangian mechanics. That was just in the last few years. So you’re absolutely right about how it’s been overlooked far too much.

  • October 4, 2013 · Joseph


    The whole development in Lanczos is leading up to the Hamilton-Jacobi equation, and it’s actually explained in two distinct ways. Physicists’ interest in the subject is directly related to QM and only rarely to Classical Mechanics by itself (although there have been some brilliant uses of it in mechanics over the years).

    Einstein’s re-imagining of the quantum conditions in the old quantum theory was directly stated in terms of S for example (see his 1917 paper on the subject).

    In physics grad school, Analytical Mechanics was synonymous with classical mechanics and continuum mechanics was viewed as a charming relic and almost entirely ignored. A second year graduate continuum mechanics course I took from the Engineering Mechanics department covered the theory side of the subject ridiculously better than anything I got from the physics department.

    I’ve come to have the exact opposite view of the true nature of mechanics over the years. Continuum Mechanics now appears to me as the true subject and Analytical Mechanics is really the study of the special differential equations which result from finding extremums.

  • October 5, 2013 · Daniel Lakeland

    Well, continuum mechanics is more or less where all the action is in the use of Newtonian / Non-quantum mechanics to actually get real world stuff done. Sure there’s some interesting stuff in granular materials where continua are problematic and maybe discrete element simulations are the only way we can move forward, and there’s nano-scale stuff where it breaks down too, but if you want to do something practical like predict fracture, you should look no further than a recent continuum mechanics approach called Peridynamics for example.

  • October 15, 2013 · ishi

    I came across this blog via googling. It’s quite interesting—the issues in probability, econ, and physics overlap my own, but there are so many views at this point I don’t know what to say, nor maybe ever will.

    I will just mention that while you dismiss Frieden (like many others, though not all), I was interested when I came across his stuff in Phys Rev E (deducing all the Lagrangians in physics from Fisher info). I saw Streater’s and Shalizi’s reviews of Frieden. But, interestingly, Frieden has many publications in the last few years in PLOS, physics journals and so on, some co-authored with well known people who are not seen as ‘cranks’. (I can mention that one of his papers in PLOS ONE on cancer is referenced by the physicist Paul Davies, also in a PLOS paper, who is well known (and possibly both are in Arizona, and I am familiar with at least 4 others in that system who seem to walk the border between rigor and wack). Davies I think now has an NIH grant, but the cancer community sees him as wack.)

    So, I guess my interest is this ‘he said, she said’ situation. Maybe the idea is ‘if it sells, then who cares’. Or, ‘multiverse’ or paraconsistency — everyone is right, and we are all liars. Or maximum entropy.

    Edward Nelson is an interesting character. I think he is also an ultrafinitist, and a hardcore Christian, and not long ago had a proof that ZFC set theory was inconsistent, though he retracted it. Supposedly he has some paper on the double slit experiment in QM explaining it classically (and there are hundreds of such attempts), but my impression is he has decided (as I more or less did when I first looked at his stochastic mechanics approach to quantum theory) that his approach doesn’t explain anything (nor does Bohm’s), except possibly provide a different interpretation.

    I will add that your post on analogies between thermodynamics and economics might be disputed by some (many papers on this in the last 10 years), though you do say the formalism can be applied in a statistical, rather than strict, sense (e.g. since conservation laws don’t hold exactly in econ). The ‘he said, she said’ dynamic also holds in this area. (Of course econ is known for this—are markets efficient, or not, etc.)

    A meta-analyses of these controversies might be interesting.

  • October 15, 2013 · Joseph

    I had no idea Shalizi reviewed Frieden’s book! I read it:

    I agree with everything Shalizi says, except I think he understates how bad Frieden really is. I really don’t think he would have been given such a respectful hearing on this if he weren’t an already respected Physicist.

    The overwhelming sense I got last time I looked at his book was that Frieden could mangle just about any equation into producing a Fisher’s Information term, the same way some people see Jesus’s face in moldy walls, potato chips, ice cream swirls and everywhere else.

    My best guess is that Fisher’s Information fundamentally has nothing to do with QM, and that the equations work out that way for other reasons. Although, since it is there you can use some mathematical intuition derived from Fisher’s Information in statistics as a tool to help understand QM a little better. But that’s pretty weak tea in truth.

  • December 29, 2013 · Corey

    Late (either a year late, or a couple of months late, depending on how you count) breaking news: John Baez noticed an analogy between the Boltzmann distribution in statistical mechanics and Feynmann’s path integral approach to quantum mechanics about a year ago. He saw immediately that this implied that Feynmann’s path integral must be a stationary point of some functional (of a complex function, hence no optimization) subject to some constraints; he named that functional “quantropy”. He developed the analogy between it into a paper that is now on the arXiv.

  • December 29, 2013 · Corey

    Oopses: Somehow Boltzmann’s name generated a second trailing “n” in Feynman’s name. In “He developed the analogy between it”, the last two words are relics of cut-n-paste editing and should have been deleted.

  • December 30, 2013 · Joseph


    Interesting, but Baez didn’t notice this analogy; he even says it is “well known”. I would say it’s obvious enough that most physicists probably recognize it even before it’s pointed out to them.

    By the way, the third Wilson to win a Nobel Prize in Physics got it basically for mathematically exploiting this analogy. Nobel Prizes in Physics are almost a birthright for Wilsons.

  • December 30, 2013 · Corey

    One day, I tell you, I will be aware of all physics traditions.

    I was thinking today of the way that entropy arises as an asymptotic approximation of the multinomial coefficient in the Wallis argument (link) for the maximum entropy principle. I was trying to imagine something like that approach that would give rise to a principle of stationary quantropy, but I couldn’t generate any insight — I just don’t have the background. Oh well, back to seeing how useful SEV can be.
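The Wallis limit mentioned above is easy to watch converge (a sketch; the three-outcome $p$ is an arbitrary choice of mine):

```python
import numpy as np
from math import lgamma

# Wallis's argument: W = N! / (n_1! ... n_k!) counts the ways to realize
# the frequencies n_i/N, and (1/N) log W -> -sum p_i log p_i as N grows.
def log_multinomial(counts):
    n_total = sum(counts)
    return lgamma(n_total + 1) - sum(lgamma(n + 1) for n in counts)

p = np.array([0.5, 0.3, 0.2])
H = -np.sum(p * np.log(p))

for N in [100, 10_000, 1_000_000]:
    counts = (p * N).astype(int)
    print(N, log_multinomial(counts) / N, "->", H)
```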
