plasma_phys t1_irpc0n0 wrote
Reply to How fast do bubbles rise in water? by crazunggoy47
I don't know about water specifically, other than that two-phase flow is a heavily studied phenomenon for water-cooled power plants. But the amusingly titled Physics Today article "Through a Beer Glass Darkly" walks through a simple physics model of bubbles rising in a liquid for the case of dissolved CO2 in beer - with experimental data!
plasma_phys t1_j4748z8 wrote
Reply to What exactly is the process when someone "trains" an AI to learn or do something? by kindofaboveaverage
In the simplest case, you start with an untrained AI (some mathematical model with variable parameters) and training data for which you already know the desired output (this is supervised learning). Initially, the AI produces nonsense when given the training data, so you repeatedly make small changes to the parameters, each time checking that the actual output gets closer to the desired output. At some point the actual output is close enough to the desired output that you stop - the AI has been trained, and when given data sufficiently similar to the training data, it will produce the desired output even though it has never encountered that specific data before.
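That loop can be sketched in a few lines. This is a hypothetical toy example, not any particular library's API: the "model" is a single parameter w, the training data is generated from a known rule (y = 3x), and each small change to w is a gradient-descent step that moves the actual output toward the desired one.

```python
# Toy supervised training: fit y = w * x by repeatedly nudging the
# single parameter w so the model's output approaches the desired output.

def train(data, lr=0.01, steps=1000):
    w = 0.0  # untrained model: one variable parameter, initially nonsense
    for _ in range(steps):
        for x, y_desired in data:
            y_actual = w * x              # model's current output
            error = y_actual - y_desired  # distance from the desired output
            w -= lr * error * x           # small change that reduces the error
    return w

# Training data for which we already know the desired output (y = 3x)
data = [(x, 3.0 * x) for x in range(1, 6)]
w = train(data)
print(round(w, 3))       # converges to 3.0

# The trained model also handles an input not in the training data:
print(round(w * 10.0, 2))  # close to 30.0
```

The same structure scales up: real neural networks just have millions of parameters instead of one, and the "small change" is computed by backpropagation rather than this hand-derived update.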
It obviously gets more complicated, especially when you don't already know your desired output (unsupervised learning) or in more complex designs such as generative adversarial networks. Some machine learning approaches use specific training algorithms, such as the Baum-Welch algorithm for Hidden Markov Models, while others use generic optimization algorithms. In general, though, the process of repeatedly making small changes and comparing the new result to the previous one is a nearly universal part of training AI.
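The "generic optimization" case can be made concrete with hill climbing, one of the simplest such algorithms - note this is my illustrative sketch, not an algorithm named above: perturb the parameters slightly at random, compare the new result to the previous one, and keep the change only if it's an improvement.

```python
import random

def loss(params, data):
    """How far the model y = a*x + b is from the desired outputs."""
    a, b = params
    return sum((a * x + b - y) ** 2 for x, y in data)

def hill_climb(data, steps=5000, scale=0.1):
    random.seed(0)          # fixed seed for a reproducible run
    params = [0.0, 0.0]
    best = loss(params, data)
    for _ in range(steps):
        # Make a small random change to the parameters...
        candidate = [p + random.uniform(-scale, scale) for p in params]
        trial = loss(candidate, data)
        # ...and keep it only if the new result beats the previous one.
        if trial < best:
            params, best = candidate, trial
    return params

data = [(x, 2.0 * x + 1.0) for x in range(-5, 6)]
a, b = hill_climb(data)
print(round(a, 1), round(b, 1))  # near 2.0 and 1.0
```

No gradients are needed here, which is why variants of this idea (random search, evolutionary strategies) get used when the model isn't differentiable.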