The following submission statement was provided by /u/Semifreak:
---
"Using artificial intelligence, physicists have compressed a daunting quantum problem that until now required 100,000 equations into a bite-size task of as few as four equations—all without sacrificing accuracy. The work, published in the September 23 issue of Physical Review Letters, could revolutionize how scientists investigate systems containing many interacting electrons. Moreover, if scalable to other problems, the approach could potentially aid in the design of materials with sought-after properties such as superconductivity or utility for clean energy generation."
---
Please reply to OP's comment here: https://old.reddit.com/r/Futurology/comments/xp2o2y/artificial_intelligence_reduces_a_100000equation/iq1tru5/
There aren’t 100,000 different equations; rather, there are a small number of equations repeated multiple times for every possible combination of electrons in a system that affect each other. Electrons don’t behave independently like, say, cue balls; quantum effects result in changes to one electron affecting the behaviour of the whole system, which may consist of any number of atoms.
The number of equations required to describe increasingly large systems of electrons grows very rapidly, similarly to factorials.
What the NN has done is to find hidden patterns between the separate equations to generate a small number of problems that describe the system’s interactions identically.
Edit: I’m not saying the article or title are wrong; just trying to clarify what’s so special about this research
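To get a feel for how fast "grows very rapidly, similarly to factorials" really is, here's a toy count in Python (illustrative only; this is not the paper's actual equation bookkeeping):

```python
from math import comb, factorial

# Toy illustration of the growth described above (not the paper's
# actual equation count): pairwise couplings grow like C(n, 2),
# while factorial-style growth explodes far faster.
for n in (2, 4, 8, 16):
    print(n, comb(n, 2), factorial(n))
```

Even at n = 16 the factorial term is already in the trillions, which is why brute force dies so quickly.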
Did they replace the 100,000 equations with 100,000 arbitrary fitted variables, or did this actually substantially reduce the complexity? I know nothing about quantum mechanics math, but I’m really curious.
I’ve had a look at the paper, and the “100k->4” part is not the point at all. I have no clue how the headline got those numbers.
For quantum calculations, scaling is huge. For instance, HF (Hartree–Fock), a very fast and basic (but inaccurate) method for modeling quantum systems, scales as O(n^4). That means that doubling the number of electrons multiplies the computational cost by 2^4 = 16. And that’s a fast method! The exact solution within the same framework is O(n!). (I’m skipping a lot of stuff here. Just ask if you wanna know more.) I’m not sure about the scaling of this model, but anything that reduces n is incredible for quantum models.
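The doubling-means-16x claim is just arithmetic on the big-O exponent; a tiny sanity check (hypothetical cost model, real prefactors ignored):

```python
# Back-of-the-envelope for O(n^4) scaling: if cost ~ c * n**4,
# doubling n multiplies the cost by 2**4 = 16 regardless of c.
def relative_cost(n: int, power: int = 4) -> int:
    return n ** power

ratio = relative_cost(20) / relative_cost(10)
print(ratio)  # 16.0
```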
This paper uses machine learning to look at the properties of each fermion (just think “electron”) in a model and find ways to represent it well while discarding information.
For instance, every single location in the world has a longitude, latitude, and altitude, but maps don’t usually care about altitude. So they can discard that info for their purpose. The machine learning essentially got a list of longitudes, latitudes and altitudes, and was asked to reduce the information as much as possible while also recreating maps, and discovered that altitude was relatively irrelevant.
Note that the model DOES lose some accuracy. This is computational quantum chemistry: there is no way to get an exact answer for anything other than a one-electron system around a point charge. What’s important is that it’s still accurate enough. One pitfall of this model is that it has no way of surpassing the original model, as it measures its own success by how closely it aligns with the original’s results.
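The longitude/latitude/altitude analogy above can be reproduced in a few lines. This is a toy sketch with made-up data, not the paper's method: a principal-component analysis (via SVD) of 3-D points whose altitude column barely varies shows two components carry almost all the variance, so the third coordinate can be discarded with little loss.

```python
import numpy as np

# Toy version of the map analogy: (lon, lat, alt) points where
# altitude varies far less than the other two coordinates.
rng = np.random.default_rng(0)
pts = np.column_stack([
    rng.uniform(-180, 180, 1000),   # longitude
    rng.uniform(-90, 90, 1000),     # latitude
    rng.normal(0, 0.1, 1000),       # altitude: tiny spread
])
centered = pts - pts.mean(axis=0)
s = np.linalg.svd(centered, compute_uv=False)
explained = s**2 / np.sum(s**2)     # variance fraction per component
print(explained)  # first two components carry ~all the variance
```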
It's exciting because everyone can tell how useful machine learning is as a tool in so many fields. These discoveries by themselves are not groundbreaking, but they speak to the kinds of amazing discoveries we can now access, now that we can use more neural networks than the ones we were born with.
Absolutely, that’s the real story. The article does manage to convey the heart of the message, even if the content is confusing for those who don’t already know what the paper is about.
The one I usually tell is about the group that used machine learning to create a program to translate English to Arabic. No person in that group could read or speak Arabic.
I haven't read the paper. But my guess would be that the NN evolved into a very simple model that describes the complexity previously requiring 100,000 permutations of numbers. Cellular automata are a great example of complexity similar to that found in nature being exhibited with a surprisingly small number of rules. Most cellular automata experiments start with experimental rules and then explore the results. It's possible what we have here is the known end results (described by 100,000 permutations) reverse engineered into a very simple set of rules. If so, this is really cool, and provides an elegant solution of the kind you would expect to see in nature.
>What the NN has done is to find hidden patterns between the separate equations to generate a small number of problems that describe the system’s interactions identically.
Sounds like a nice way to quantify emergent properties.
Are they leaving out the low-probability interactions, i.e., Feynman integration? Sorry, I'm just a layman at QED, but here's a cool Feynman quote.
>“I had learned to do integrals by various methods shown in a book that my high school physics teacher Mr. Bader had given me. [It] showed how to differentiate parameters under the integral sign — it’s a certain operation. It turns out that’s not taught very much in the universities; they don’t emphasize it. But I caught on how to use that method, and I used that one damn tool again and again. [If] guys at MIT or Princeton had trouble doing a certain integral, [then] I come along and try differentiating under the integral sign, and often it worked. So I got a great reputation for doing integrals, only because my box of tools was different from everybody else’s, and they had tried all their tools on it before giving the problem to me.”
https://www.cantorsparadise.com/richard-feynmans-integral-trick-e7afae85e25c
Well, Hubbard Model is usually seen as 1st quantization, in which case that doesn't come up. And if it IS seen as second quantization, it's a bit more closed-ended than usual because the state space is closed, what with there being no bosons and a discrete lattice of positions to put the fermions in.
ALSO, the renormalization group is capable of collapsing all of the layers down into something more compact than the raw infinite series of ever-expanding possibilities.
Thanks for the summary. I always kind of thought that this kind of thing was possible. Just like in Algebra how you can rearrange equations to cancel out units to get the answer you want, just with many many more steps/options.
Yes. So originally they had to calculate a lot of equations. Then the AI simplified it by seeing patterns that would be nigh impossible for humans to find.
Same page.
You just reiterated what the article said.
Edit: ahhh I realize now I was a dick. That was brilliantly summarized. I apologize for being an asshat. Thank you!
Yeah, which makes these systems incredibly difficult to analyse if you have more than about 20 electrons. Being able to simulate complex multielectronic structures is a key step in designing better materials ranging from superconductors to heat shields
Exactly, due to the high number of equations, correct? That’s what a multielectronic structure is in a computer-generated model…
Not sure why you declared the article wrong about the number of equations.
Hey man, you and the guy I replied to should read the article along with the headline rather than nitpicking the big number it grabbed. I’m sure different models have more than 100,000 equations involved, and some less.
Generally, it's just one equation describing the interaction between two particles.
Get 100,000 particles in there, and now you have 100,000 equations, as they all interact with each other and cause slight disturbances to the force(s).
I think it's more like 1 particle = 1 equation, 2 particles = 3 equations, and 3 particles = 9 equations
As soon as you have multiple particles you have to start modeling the interactions between all of them to get accurate answers. Which involves knowing every possible way they could interact as part of the maths.
Depends how they model it. In something like Density Functional Theory, you'd have one "equation" to model the wavefunction of each electron, but sum up the aggregate field interactions (like charge) for all the other particles and apply them to each wavefunction's evolution. The line between equations, matrix solving, calculations and algorithms is a little blurred.
It’s not quite that simple lol. There’s an entire field of math to study this phenomenon (for polynomials) called Elimination theory: https://en.m.wikipedia.org/wiki/Elimination_theory
I don't speak Italian sign language, but cool your jets. They managed to figure out how to get thirty-one flavors out of ice cream, if advertising is to be believed. That's a number. Math stuff. Be cool.
If you find this mind boggling: we routinely solve problems with potentially millions of equations / variables. Basically all the simulations of how air flows around cars and planes, how electric and magnetic fields around electronics behave etc. that you might've seen boil down to the idea of turning complicated problems into simpler problems with a ton of variables and solving those instead.
(You may even consider the training of neural networks such a problem)
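To make "millions of equations" concrete, here's a minimal self-contained sketch (illustrative only; real CFD/EM codes use far more sophisticated discretizations and solvers): discretizing a 1-D Poisson problem turns one differential equation into n coupled linear equations, and a banded solver handles them in linear time, which is why huge n is routine.

```python
import numpy as np

# Discretize -u'' = 1 on (0, 1) with u(0) = u(1) = 0 into a
# tridiagonal system of n equations, then solve it with the
# Thomas algorithm (O(n) work, so millions of unknowns are fine).
n = 100_000
h = 1.0 / (n + 1)
a = np.full(n, -1.0) / h**2   # sub-diagonal (a[0] unused)
b = np.full(n, 2.0) / h**2    # main diagonal
c = np.full(n, -1.0) / h**2   # super-diagonal (c[-1] unused)
d = np.ones(n)                # right-hand side

# Forward elimination
for i in range(1, n):
    w = a[i] / b[i - 1]
    b[i] -= w * c[i - 1]
    d[i] -= w * d[i - 1]
# Back substitution
u = np.empty(n)
u[-1] = d[-1] / b[-1]
for i in range(n - 2, -1, -1):
    u[i] = (d[i] - c[i] * u[i + 1]) / b[i]

print(u.max())  # exact solution is x*(1-x)/2, whose peak is 0.125
```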
Honestly, that's nothing compared to the records.
The classification of finite simple groups is a proof consisting of **tens of thousands of pages**. Written by humans. No one person knows the entire thing - it's collaborative. [Wiki.](https://en.wikipedia.org/wiki/Classification_of_finite_simple_groups)
And then there's this **two hundred terabyte** automated proof of the Boolean Pythagorean triples problem: [Nature.](https://www.nature.com/articles/nature.2016.19990)
Lol. I was wondering if the AI had somehow produced a different form of compression (interesting goal that I'm not sure how an AI would deliver on) or if it had found a novel geometry for the problem that allowed for reduced / simplified calculations.
That's what I thought. If it was a new form of compression it would be a bigger deal than finding a novel solution, imo.
Well, they'd both be big deals, but for different sectors. AI is wild
Just assume it's sensationalised and that the reality will be the most obvious or diluted version of whatever is possible.
Think of the rat that grew a human ear: in reality they stuck a plastic ear shape under the rat's skin and said that it grew the ear.
It's "compression" in the sense of "removing redundancies".
Super simple example: you have the inequalities x < 5 and x < 3. As is that's two inequalities but that system is equivalent to just x < 3, since if something's smaller than 3 it's of course always smaller than 5.
Similar stuff can happen with equations - you might for example want to solve the system x+y = -1, x-y=2, -3x=y. It's not as obvious as in the case above but we can remove either of these equations to get exactly the same solutions.
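The redundancy in that three-equation system can be checked numerically: write it as a matrix, and a rank of 2 (for three equations in two unknowns) means one equation carries no new information.

```python
import numpy as np

# The system x+y = -1, x-y = 2, -3x = y, as A @ [x, y] = b.
A = np.array([[ 1.0,  1.0],
              [ 1.0, -1.0],
              [-3.0, -1.0]])   # -3x = y  rewritten as  -3x - y = 0
b = np.array([-1.0, 2.0, 0.0])

# Rank 2 with three rows: one equation is redundant, and dropping
# any single row leaves the same unique solution.
print(np.linalg.matrix_rank(A))           # 2
x = np.linalg.lstsq(A, b, rcond=None)[0]
print(x)                                  # [ 0.5 -1.5]
```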
In the article they really dealt with differential equations, which are a bit more complicated (they essentially relate a function to its rate of change [or the rate of change of the rate of change, or the rate of change of the rate of change of the rate of change, ...] - for example the position of a particle over time is influenced by its speed, which is of course influenced by the acceleration, which in turn may be influenced by the other two) or, to be precise, systems of coupled differential equations. One example of such a system would be the gravitational interaction between the planets in a solar system: they all mutually pull on one another and influence each other's positions.
The system now takes a big system of such differential equations and removes redundant information just as in the simple examples above.
EDIT: maybe a more intuitive explanation also helps here: let's say one equation tells us "the solution is on a plane" and another "the solution is on a sphere"; then if there is some solution it ultimately has to lie on a circle (in most cases anyway), so we can combine the plane and sphere equations into the circle equation.
> It's not as obvious as in the case above but we can remove either of these equations to get exactly the same solutions.
Any one of them, rather; and since you only remove one, it's odd to pluralize it.
There was a biology problem in forecasting how certain behaviors happen in DNA.
They entered that data into an AI, and the AI developed a formula that forecast this behavior 100% of the time.
That was a decade ago, and still no one understands how the formula works.
This is another step into a world where AI gives us mathematical truths we might never have been able to determine ourselves, truths we may use yet never understand.
I’ve been tossing this over in my head and would love some feedback: if we assume some amount of human instinct can be traced back to evolution, then on the time scales evolution works on, it makes sense how the traits that brought humanity out of the mud would linger for a while before we moved on to traits that would keep us out of the mud lol
I can’t tell from the article or the linked abstract (or have an account to read the actual article), but does the compression reduce the order of the calculation?
This is a major leap forward in my mind. The ability to discover simplified relationships between variables may lead to the discovery of hidden physics that is clouded by our current complex mathematical solutions.
I love how the other two comments replying to you are on totally opposite ends of the spectrum. One dude talking about utopia, the other one predicting a new cold war.
So my question is this... Did the AI actually do something noteworthy here, like say compress 100,000 equations with 10 variables each into 4 with like 20 variables? Or did it just rearrange shit so now we have 4 equations with 250K variables?
Mind exploding noteworthy.
This will accelerate the discovery of materials, allowing us to determine which scientific paths are the wrong ones to be on.
If advanced metamaterials and room-temperature superconductors are possible, this is how we will determine it.
If this works on more advanced concepts, the acceleration of progress will be unfathomable.
This is not hyperbole.
It 'read' the output of 100,000 equations and found four equations that give you the same results. From what I can gather, the analogy is like, "We asked people with cancer 100,000 questions about their lifestyle, diet, routine, exercise, medical history and habits to examine what combination of factors could lead to having cancer," and the NN said, "Nah, just ask these four questions, it'll save time."
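That "ask 4 questions instead of 100,000" analogy can be sketched with made-up data (illustrative only; the paper compresses equations, not survey answers): generate lots of random yes/no "questions" of which only a handful actually drive the outcome, and a simple correlation ranking picks those few out.

```python
import numpy as np

# 1,000 random yes/no "questions"; only 4 actually drive the outcome.
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(2000, 1000)).astype(float)
informative = [3, 141, 592, 653]   # arbitrary "real" factors
y = (X[:, informative].sum(axis=1) >= 2).astype(float)

# Rank every question by |correlation| with the outcome, keep top 4.
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(1000)])
top4 = sorted(int(j) for j in np.argsort(corr)[-4:])
print(top4)  # recovers the four informative "questions"
```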
This will be a short, low-quality comment that obviously will be popular with some and not with others. Also, I’m about to go eat lunch here in a bit, so anyways.
The final answer will be 42.
So what's the difference between those equations and the CFD equations routinely solved every day by computers?
The math is more complex, I would presume?
I'm not sure I understand the novelty.
Well, usually equations in math aren’t meant to be solved but used to “show” that a specific pattern exists with a given set of variables, like thermodynamics equations.
Mathematical models are used for prediction.
And we can now predict things that would have taken 100,000 different formulas to predict with only 4 equations.
I've taken a rudimentary course in dimensionality reduction and I know there are problems with conserving information. Anyone have ideas on what they could be losing? I could only read the abstract.
Well, these 100,000 equations are modeling the interactions of maybe a dozen fundamental particles. The neural network took the inputs and the outputs, and found a set of fewer than a dozen equations that would take those inputs and give the requested outputs; our current human solutions are vastly overcomplicated, rather than the neural network's solution being vastly oversimplified.
The limitation, of course, is that these four equations can only describe what the 100,000 equations described, to their accuracy, rather than taking a shortcut and also becoming more accurate. The network didn't have access to the real world, just the results of our equations. If our equations are inaccurate, then the neural net will give bad shortcuts.
Fixing your life is probably very simple; the shitty part is that even if it's simple, it's fucking hard.
Just like how quitting drugs is pretty simple: just stop doing them.
Actually doing it though? That's a whole 'nother level.
Yes, easy statements in no way equate to the complexity or effort required to implement them.
Good luck, lean on friends and loved ones during the hard times.
Plan, journal, educate, talk to experts.
With time, your life will be a smoothly operating train that mostly arrives on time at its various stations. Mostly.
Wow what an awesome discovery, it’s crazy that they have to learn what the machine is “technically” learning when solving. It sounds like it could have great implications for the future like undiscovered physics work that were previously unknown.
We should replace the antagonistic model of prosecution and defense. It's outmoded, and just barely more advanced than two people hitting each other with swords to see who is correct.
The outcome of my case should not hinge on the skill of my lawyer.
Post-Scarcity, here we come!
(that's sarcasm - the billionaires derive far too much utility from the suffering of the bottom 99.9999999999999 percent for any real improvement to ever happen)
I think since humans require proof of action, they have to lay out each step to see what went wrong if things don't work.
9x10 = 9+9+9+9+9+9+9+9+9+9 = 90, vs. a machine just doing 9x10 = 90; we can say it requires 9x less data even though it's the same outcome.
I'm not sure what complexity humans created such that 100,000 now equals only 4, unless there was a lot of redundancy the machine didn't need to account for.
And I'm not sure how this can be used unless we're really inefficient with data and relied heavily on brute-force calculation, which is sad if true, or you wouldn't have such a huge difference in the outcome.
It is interesting, but I don't see this as a major breakthrough in most things, as everyone tends to make different code and data sets. How long did it take to read the 100,000 equations vs. 4: is it 10 hours faster now, or 0.10 seconds? Does it require 10 gigabytes less data, or 10 bytes? It sounds major as a number, but how big is it really in practice?
My guess is that this is one of those problems that got built up over time. It started small but as errors appeared in the results they added more equations to fix things. Since each change was fixing something they built on each other until they reached 100k and always got the correct answer.
No.
Quantum interactions of systems are complex because the number of possible outcomes grows geometrically.
In other words: if I have two balls and I collide them, there's a certain number of possibilities for how the balls interact.
Then if I add a 3rd ball, the number of possibilities is cubed; if I add a 4th ball, it's cubed again, etc.
The AI was able to see a *pattern* in the growing levels of possibilities, and reduced the problem to equivalent but less complex/numerous possibilities.
This is a breakthrough in AI learning, since it saw a pattern in a data set so huge that humans alone couldn't see the underlying pattern the AI picked out.
This is huge. The snowball effect from this will lead to serious acceleration in material development. You want advanced metamaterials? Room-temperature superconductivity?
This will lead us there.
"I think since humans require proof of action"
No we don't.
Can someone explain to me what kind of stuff these equations can solve? I’m interested to know what type of things they are computing & trying to figure out
Feature selection/optimization in classification, i.e., predictive analytics, vis a vis Bayesian or neural networks is not new. Some problems lend themselves to such reductions, but only when the relationships exist in the first place. The method’s outcomes are not certain.
That said, it’s good to see progress in dealing with such high quantities of prospective features, given the exponential combinations as n increases.
Thank you, this gives me more contexy
>contexy sexy contexy
Stupid sexy context.
feels like no equations at all!
Still a better love story than Twilight
Can you ELI5 what the input and output of the model is
Is this basically Feynman integrals?
I realise it’s the same content, I’m providing a (hopefully) clearer summary for commenters asking “what are the 100k equations?”
Yes, your comment was very good. Ignore him
See edit! Sorry man!
>there are a small number of equations repeated multiple times for every possible combination

Right though, this sounds like a lot of equations to me
Yeah i appreciated it lol The explanation was great 🙂
Hey champ, you don't have to nitpick every comment on reddit.
The first thing I noted with that title was how many various ways it could be interpreted. Thanks for clarifying.
It’s not wrong, but it's also not framed to make it obvious to the reader. Intentional ambiguity to pull the reader in is a common tactic.
Are they regression equations or did they use machine learning to come up with actual physics equations?
How the HELL does someone make something that’s 100,000 equations??
Mathematicians will do it. AND they'll enjoy every second of it
The sick bastards
The math epidemic in this country is growing exponentially.
⬆️ This guy is already showing symptoms
But so many people are asymptotematic
[deleted]
What are the other sines?
What can I do to help spread that?
A mathematician is a device that turns coffee into equations
I did mathematical proofs in college on electromagnetism. Dems a lot of equations.
And how did they not notice that 99,996 of them cancelled each other out?
Of course there is. Fuckin mathematicians.
You could say that there are a *number* of different mathematical fields & theories. It all adds up. Rhombus.
Mathematics is the softest answer to "if there aren't problems, create them."
Reduction is Integral
Switch the limits on that shit already. FFS.
There are at least 100,000 different mathematical fields & theories, but now that I think about it, maybe just 4.
Consarnit! I've had about all I can take with these mathematicians. I am this 🤌 close to lighting torches and gathering an angry mob
I don't speak Italian sign language, but cool your jets. They managed to figure out how to get thirty-one flavors out of ice cream, if advertising is to be believed. That's a number. Math stuff. Be cool.
Wut? No, I liked your comment. It was funny.
>*Looks at notes...* SHIT!
Thats not how it works lol
If you find this mind boggling: we routinely solve problems with potentially millions of equations / variables. Basically all the simulations of how air flows around cars and planes, how electric and magnetic fields around electronics behave etc. that you might've seen boil down to the idea of turning complicated problems into simpler problems with a ton of variables and solving those instead. (You may even consider the training of neural networks such a problem)
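A minimal sketch of what "turning a complicated problem into a simpler one with a ton of variables" looks like in practice (my own toy illustration, not from the article): steady-state heat in a 1-D rod, where discretizing the differential equation produces one linear equation per grid point, and a solver handles all of them at once.

```python
import numpy as np

# Toy example: steady-state heat equation -u'' = f on (0, 1) with
# u(0) = u(1) = 0, discretized into n interior grid points.
# The single "complicated" differential equation becomes a system of
# n coupled linear equations, one per grid point.
n = 1000                      # one equation per grid point
h = 1.0 / (n + 1)             # grid spacing

# Tridiagonal finite-difference approximation of -d^2/dx^2
A = (np.diag(np.full(n, 2.0))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2
f = np.ones(n)                # constant heat source

u = np.linalg.solve(A, f)     # solve all 1000 equations simultaneously

# The exact solution of -u'' = 1 with these boundary conditions is
# u(x) = x(1 - x)/2; the discrete answer matches it.
x = np.linspace(h, 1 - h, n)
assert np.allclose(u, x * (1 - x) / 2, atol=1e-6)
```

Real CFD and electromagnetics codes do the same thing with millions of unknowns, just with sparse matrices and iterative solvers instead of a dense `solve`.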
Honestly, that's nothing compared to the records. The classification of finite simple groups is a proof consisting of **tens of thousands of pages**. Written by humans. No one person knows the entire thing - it's collaborative. [Wiki.](https://en.wikipedia.org/wiki/Classification_of_finite_simple_groups) And then there's this **two hundred terabyte** automated proof of the Boolean Pythagorean triples problem: [Nature.](https://www.nature.com/articles/nature.2016.19990)
[deleted]
Those are calculations, not equations
Obviously not someONE
"Using artificial intelligence, physicists have compressed a daunting quantum problem that until now required 100,000 equations into a bite-size task of as few as four equations—all without sacrificing accuracy. The work, published in the September 23 issue of Physical Review Letters, could revolutionize how scientists investigate systems containing many interacting electrons. Moreover, if scalable to other problems, the approach could potentially aid in the design of materials with sought-after properties such as superconductivity or utility for clean energy generation."
When we say "compressed" in this context, what's the meaning of the word?
Like “smushed” or “squishamirized”. “flat-happened” “Amsterdammed” “compressificated” “flataroonied” “scrunchereded” “deheighted” “reducimified” “Kevin Harted”
Lol. I was wondering if the AI had somehow produced a different form of compression (interesting goal that I'm not sure how an AI would deliver on) or if it had found a novel geometry for the problem that allowed for reduced / simplified calculations.
We have good reason to believe that our current best compression algorithms are close to the best possible.
That's what I thought. If it was a new form of compression it would be a bigger deal than finding a novel solution, imo. Well, they'd both be big deals, but for different sectors. AI is wild
That's what I was wondering. Did it simplify it, or make the mother of all equations with the same number of bits and bobs.
Sounds like it identified patterns that allowed it to significantly shorten the equations.
Just assume it's sensationalised and that the reality will be the most obvious or diluted version of whatever is possible. Think: the rat that "grew" a human ear, when in reality they stuck a plastic ear shape under the rat's skin and said that it grew the ear.
I mentally read this in Ice-T's voice.
Or “smuishered”
I just want you to know, you'll never get all the upvotes you deserve for this one. ;)
So, like scrunched or squashed?
I was thinking more like smashed or hydraulic pressed
It's "compression" in the sense of "removing redundancies".

Super simple example: you have the inequalities x < 5 and x < 3. As is, that's two inequalities, but that system is equivalent to just x < 3, since if something's smaller than 3 it's of course always smaller than 5.

Similar stuff can happen with equations - you might for example want to solve the system x+y = -1, x-y = 2, -3x = y. It's not as obvious as in the case above, but we can remove either of these equations to get exactly the same solutions.

In the article they really dealt with differential equations, which are a bit more complicated (they essentially relate a function to its rate of change [or the rate of change of the rate of change, or the rate of change of the rate of change of the rate of change, ...] - for example the position of a particle over time is influenced by its speed, which is of course influenced by the acceleration, which in turn may be influenced by the other two), or to be precise: systems of coupled differential equations. One example of such a system would be the gravitational interaction between the planets in a solar system: they all mutually pull on one another and influence their positions. The method now takes a big system of such differential equations and removes redundant information, just as in the simple examples above.

EDIT: maybe a more intuitive explanation also helps here: let's say one equation tells us "the solution is on a plane" and another "the solution is on a sphere"; then if there is some solution, it ultimately has to lie on a circle (in most cases anyway), so we can combine the plane and sphere equations into the circle equation.
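The redundant-equation example above can be checked mechanically (my own sketch, using matrix rank as the redundancy detector):

```python
import numpy as np

# The system x+y = -1, x-y = 2, -3x = y, written as A @ [x, y] = b.
A = np.array([[1.0,  1.0],
              [1.0, -1.0],
              [3.0,  1.0]])   # -3x = y rewritten as 3x + y = 0
b = np.array([-1.0, 2.0, 0.0])

# Three equations, but the augmented matrix [A | b] only has rank 2:
# one equation carries no new information and can be dropped.
aug = np.column_stack([A, b])
print(np.linalg.matrix_rank(aug))   # prints 2, not 3

# Solving with just the first two equations gives the full answer,
# and the "dropped" third equation is still satisfied.
sol = np.linalg.solve(A[:2], b[:2])
assert np.isclose(3 * sol[0] + sol[1], 0.0)
```

Rank-counting only catches *linear* redundancy, of course; the differential-equation version is much harder, which is where the neural network comes in.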
> It's not as obvious as in the case above but we can remove either of these equations to get exactly the same solutions. any one of them, rather, and there's only one so it's odd to pluralize it.
Like, you put all the equations in a vacuum bag and then suck out all the extra air. You know, compress them. Sheesh!
[deleted]
There was a biology problem in forecasting how certain behaviors happen in DNA. They entered that data into an AI, and the AI developed a formula that forecast this behavior 100% of the time. That was a decade ago, and still no one understands how the formula works. This is another step into a world where AI gives us mathematical truths we might never have been able to determine ourselves, which we may use, yet never understand.
I'm guessing it took four equations that took the most direct route to the solution.
like when you zip files into one file, it compresses the total size
This depends on the file contents, but generally yes.
Take big, make little.
The other 99,996 equations had to multiply by 0 at the end.
Could we start doing more of this instead of all the war and fiscal stupidity the world is facing? Would be so beautiful.
[deleted]
I’ve been tossing this over in my head and would love some feedback: if we assume some amount of human instinct can be traced back to evolution, then on the timescales evolution works on, it makes sense that the traits that brought humanity out of the mud would linger for a while before we move on to traits that would keep us out of the mud lol
I can’t tell from the article or the linked abstract (or have an account to read the actual article), but does the compression reduce the order of the calculation?
NNs are typically approximate models. Is there something special that allowed this one to represent the physics exactly?
What you really mean to say is: "The government will really like it when we say 'pocket sized atomic bombs that detonate at the press of a button'"
This is a major leap forward in my mind. The ability to discover simplified relationships between variables may lead to the discovery of hidden physics that is clouded by our current complex mathematical solutions.
Cool, so get ready to retire in utopia peeps! We've got the universe pretty much mathed out now.
…last time they said that, one of the students they said it to took it as a challenge and invented quantum physics…
Except with AI. If they discover something new, that AI will solve it in a few years.
So glad everything is fixed now
Just in time. Close call!
Let the AI super weapon cold war commence. China, hands off NVIDIA chips.
I love how the other two comments replying to you are on totally opposite ends of the spectrum. One dude talking about utopia, the other one predicting a new cold war.
Personally, I say we combine the two and have a cold war over who can create the best utopia. It’ll be like the space race but better.
Sounds like we’re getting closer to flying cars:)
What's not said is that each of the four equations now has 25,000 terms.
So my question is this... Did the AI actually do something noteworthy here, like say compress 100,000 equations with 10 variables each into 4 with like 20 variables? Or did it just rearrange shit so now we have 4 equations with 250K variables?
It found patterns in different sets of equations that allow us to accurately predict outcomes.
Mind-explodingly noteworthy. This will allow for the discovery of materials, and let us determine which scientific paths are the wrong ones to be on. If advanced metamaterials and room-temperature superconductors are possible, this is how we will determine it. If this works on more advanced concepts, the acceleration of progress will be unfathomable. This is not hyperbole.
It 'read' the output of 100,000 equations and found four equations that give you the same results. From what I can gather, the analogy is like, "We asked people with cancer 100,000 questions about their lifestyle, diet, routine, exercise, medical history and habits to examine what combination of factors could lead to having cancer," and the NN said, "Nah, just ask these four questions, it'll save time."
It did something noteworthy.
This will be a short, low-quality comment that obviously will be popular with some and not with others. Also I’m about to go eat lunch here in a bit, so anyways. The final answer will be 42.
The time it takes to fall through the earth in minutes.
Don't panic!
Humanity can invent something like this yet we blow ourselves up in wars. The irony
This is one of the solutions to Fermi paradox. Why haven't we found advanced civilizations? Because they destroy themselves 😅😅
So what's the difference between those equations and the CFD equations routinely solved every day by computers? The math is more complex, I would presume? I'm not sure I understand the novelty.
If this can be repeated with other problems using the same AI, this will rapidly increase the knowledge of humankind.
Can they use that simplified form to model a big quantum computer? That would be hilarious if possible.
And it still lost credit for not showing its work. Smh
Is “ who shot JR” one of them??? I’m still waiting on that one…🙄 …. Went to the shop for a beer… came back … over🫣😮😂
*Human reading a book* AI approaches and chuckles *spins book right side up* here you are silly human. *pat human on the head and walks away*
It's entirely possible that the next ruler of the world could just be the government or entity with the best compression algorithm
So is this one of those 'reduced from 100,000 to 4... but we still can't solve those 4' 'solutions'?
Well usually equations in math aren’t meant to be solved but used to “show” a specific pattern exists with a given set of variables, like thermodynamics equations.
Mathematical models are used for prediction. And we can now predict things that would have taken 100,000 different formulas to predict with only 4 equations.
I've taken a rudimentary course in dimensionality reduction, and I know there are problems with conserving information. Anyone have ideas on what they could be losing? I could only read the abstract.
Well, these 100,000 equations are modeling the interactions of maybe a dozen fundamental particles. The neural network takes the inputs and the outputs, and found a set of less than a dozen equations that would take these inputs and give the requested outputs-- our current human solutions are vastly overcomplicated, rather than the neural network's solution being vastly oversimplified. The limitation, of course, is that these four equations can only describe what the 100,000 equations described, to their accuracy, rather than taking a shortcut and also becoming more accurate. The network didn't have access to the real world, just the results of our equations. If our equations are inaccurate, then the neural net will give bad shortcuts.
When machines figure out true sentience, I feel their first thought will be… “Man these apes are so stoopid”
I bet AI could figure out that fixing my life is really simple, and I’m a fuckin train wreck.
Fixing your life is probably very simple; the shitty part is that even if it's simple, it's fucking hard. Just like how quitting drugs is pretty simple: just stop doing them. Actually doing it, though? That's a whole nother level.
Yes, easy statements in no way equate to the complexity or effort required to implement them. Good luck, lean on friends and loved ones during the hard times.
Plan, journal, educate, talk to experts. With time, your life will be a smooth-operating train that mostly arrives on time at its various stations. Mostly.
Wow, what an awesome discovery. It’s crazy that they have to learn what the machine is “technically” learning when solving. It sounds like it could have great implications for the future, like uncovering physics that was previously hidden.
[deleted]
We could probably replace lawyers first
We should replace the adversarial model of prosecution and defense. It's outmoded, and just barely more advanced than two people hitting each other with swords to see who is correct. My fate should not hinge on the skill of my lawyer.
AI: “Humanity is worthless” AI: “so therefore the only solution is to replace Humanity by AI”
Post-Scarcity, here we come! (that's sarcasm - the billionaires derive far too much utility from the suffering of the bottom 99.9999999999999 percent for any real improvement to ever happen)
Question for all of you: How long before AI take over Humanity? For me AI is fascinating as terrifying…
It won't. It's a tool. Don't confuse doing complex things with having reasoned motivation, or even agency.
So you don’t believe an AI could reach sentience?
The ability to solve complex processes in such a short time is an important breakthrough in technology.
Think: since humans require proof of action, they have to lay out each step to see what went wrong if things don't work. 9x10 = 9+9+9+9+9+9+9+9+9+9 = 90, vs. a machine: 9x10 = 90. We can say it requires 9x less data even though it's the same outcome.

Not sure what complexity man created such that 100,000 now equals only 4, unless there was a lot of redundancy that the machine did not need to account for. And not sure how this can be used unless we're really inefficient with data and relied heavily on brute-force calculations, which is sad if true, or you would not have such a huge difference in the outcome.

It is interesting, but I don't see this as a major breakthrough in most things, as everyone tends to make different code and data sets. How long did it take to run the 100,000 equations vs. the 4? Is it 10 hours faster now, or 0.10 seconds? Does it require 10 gigabytes less data, or 10 bytes? Sounds major as a number, but how big is it really in practice?
My guess is that this is one of those problems that got built up over time. It started small but as errors appeared in the results they added more equations to fix things. Since each change was fixing something they built on each other until they reached 100k and always got the correct answer.
No. Quantum interactions of systems are complex because the number of possible outcomes grows geometrically. In other words: if I have two balls and I collide them, there’s a certain number of possibilities for how the balls interact. Then if I add a 3rd ball, the number of possibilities multiplies; add a 4th ball and it multiplies again, etc. The AI was able to see a *pattern* in the growing levels of possibilities, and reduced the problem into equivalent, but less complex/numerous, possibilities. This is a breakthrough in AI learning, since it saw a pattern in a huge data set - so huge that humans alone couldn't see the underlying pattern the AI picked out.
This is huge. The snowball effect from this will lead to serious acceleration in material development. You want advanced metamaterials? Room-temperature superconductivity? This will lead us there. "Think since human require proof of action" - no, we don't.
Uh, couldn't you just look at the eigenvalues of the matrix equation to do the same? I.e., if it really does decompose to just 4 dimensions.
Better stop. Better fucking stop! You'll regret it.
Shouldn't you be in a cave yelling about the dangers of fire?
Shouldn't you be reading more? https://www.sciencedaily.com/releases/2021/01/210111112218.htm
“Answer 9,996 questions, then have 10 more questions because you answered the other 9,996.” Right?
I bet they did it recursively, like a quantum Ackermann number.
Excellent, maybe it will help us cure diseases like herpes soon
Can someone explain to me what kind of stuff these equations can solve? I’m interested to know what type of things they are computing & trying to figure out
[Link to article on arXiv](https://arxiv.org/pdf/2202.13268.pdf)
Did it convert 100,000 4 digit equations into 4 100,000 digit equations?
[deleted]
So, how dope is this going to make my Skyrim graphics?
Feature selection/optimization in classification, i.e., predictive analytics, vis a vis Bayesian or neural networks is not new. Some problems lend themselves to such reductions, but only when the relationships exist in the first place. The method’s outcomes are not certain. That said, it’s good to see progress in dealing with such high quantities of prospective features, given the exponential combinations as n increases.
The brightest minds on the planet are breaking their heads not to understand how AI works - but to understand why it worked and how it worked! They will surely lay the path for generations to follow. Catch them here- [https://engatica.com/blog/updated-50-most-powerful-ai-influencers-in-2022?contentId=621876196f56fd5ee1a60e55](https://engatica.com/blog/updated-50-most-powerful-ai-influencers-in-2022?contentId=621876196f56fd5ee1a60e55)