October 29, 2019

Types of simulations
Ever since computers were invented, we have used them for simulations. Some of the very first computers were built explicitly to simulate complex physical phenomena, from ballistics to weather. And many of the largest supercomputers today are used for simulation as well, from finding new chemical compounds to predicting climate change.
I will distinguish two different types of simulation: high-precision scientific (or engineering) simulation and large-scale probabilistic simulation.
Scientific and engineering simulations typically require a very high degree of accuracy. If we want to understand how weather works, we need to model fluid dynamics equations, including factors such as temperature and density, and have a very high-resolution model of the world. If we are building a new turbine fan, we need to understand the fan's physical properties at a very fine level of detail. Factors such as thermal conductivity or elasticity may play an important role when the fan is rotating at tens of thousands of revolutions per minute and operating at temperatures of thousands of degrees Fahrenheit. Microscopic and quantum effects may be critical to understanding a particular physical system.
In probabilistic simulations, we typically have rougher models of reality, but we need to explore a vast number of options. Many of these simulations originate from traditional Monte Carlo methods (popularized by Ulam and others). Here, we try to obtain an approximate solution, or range of solutions, by running through many slight variations of a simulation scenario. If we are trying to estimate whether a particular city grid will see a lot of traffic, we can simulate thousands of different random scenarios with varying numbers of vehicles, destinations, and speeds. The actual physical characteristics of the car engines don't matter as much as the higher-level properties of the simulated world.
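To make the traffic example concrete, here is a minimal sketch of the Monte Carlo idea in Python. The model is deliberately crude, and all names and parameters (`simulate_scenario`, `estimate_peak_load`, the grid size, the vehicle count) are illustrative assumptions, not part of any real traffic simulator: each scenario just scatters vehicles over intersections at random, and we average the peak load over many random scenarios.

```python
import random

def simulate_scenario(n_vehicles, grid_size, rng):
    """One coarse scenario: assign each vehicle a random intersection and
    report the load on the busiest one. Engine physics are ignored; only
    the high-level properties of the simulated world matter."""
    counts = {}
    for _ in range(n_vehicles):
        # A route is reduced to the single intersection it passes through.
        intersection = (rng.randrange(grid_size), rng.randrange(grid_size))
        counts[intersection] = counts.get(intersection, 0) + 1
    return max(counts.values())

def estimate_peak_load(n_scenarios=1000, n_vehicles=500, grid_size=10, seed=0):
    """Monte Carlo estimate: run many slight variations of the scenario
    and average the peak intersection load across them."""
    rng = random.Random(seed)
    peaks = [simulate_scenario(n_vehicles, grid_size, rng)
             for _ in range(n_scenarios)]
    return sum(peaks) / len(peaks)

print(estimate_peak_load())
```

The answer is an approximation whose quality improves as we add scenarios, which is exactly the trade the Monte Carlo approach makes: rough individual models, sharpened by sheer volume of samples.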
Scientific simulations typically need moderate amounts of very precise data, are very hard to build and scale, and may require very expensive dedicated supercomputers. Large-scale probabilistic simulation, on the other hand, is inherently parallel, and can be built and scaled up effectively in the cloud. Because of this, these simulations can be an ideal data source for creating training sets for AI models, as well as for enabling end-to-end testing of AI systems.
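The "inherently parallel" claim is easy to see in code: because each random scenario is independent, the sample budget can be split across workers with no communication between them. The sketch below illustrates this with the classic Monte Carlo estimate of pi, fanning batches out over local processes; the same structure maps directly onto cloud worker fleets. The function names and batch scheme are my own illustration, not a specific cloud API.

```python
import random
from concurrent.futures import ProcessPoolExecutor

def count_hits(args):
    """Count random points in the unit square that land inside the
    quarter-circle. Each batch is fully independent of the others."""
    n_samples, seed = args
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return hits

def estimate_pi(total_samples=1_000_000, workers=4):
    """Split the sample budget across independent workers and combine
    the counts at the end. No worker ever needs to talk to another,
    which is why this style of simulation scales out so easily."""
    per_worker = total_samples // workers
    batches = [(per_worker, seed) for seed in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        hits = sum(pool.map(count_hits, batches))
    return 4.0 * hits / (per_worker * workers)

if __name__ == "__main__":
    print(estimate_pi())
```

Doubling the workers roughly halves the wall-clock time without changing the answer, which is the property that makes this class of simulation such a natural fit for elastic cloud capacity.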