Oleg Zabluda's blog
Wednesday, May 16, 2018
 
"""
"""
This is code implements the example given in pages 11-15 of An Introduction to the Kalman Filter (http://www.cs.unc.edu/~welch/kalman/kalmanIntro.html)
"""
http://scipy-cookbook.readthedocs.io/items/KalmanFiltering.html
http://scipy-cookbook.readthedocs.io/items/KalmanFiltering.html
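The cookbook recipe linked above estimates a constant value from noisy measurements with a scalar Kalman filter. A minimal sketch of the same scheme, assuming the parameters used in the Welch & Bishop example (true value x = -0.37727, measurement noise R = 0.1², process noise Q = 1e-5):

```python
import numpy as np

# Scalar Kalman filter estimating a constant, after the Welch & Bishop
# example (pages 11-15): truth x, measurement noise R, process noise Q.
rng = np.random.default_rng(0)
n = 50
x_true = -0.37727                      # the constant being estimated
z = rng.normal(x_true, 0.1, size=n)    # noisy measurements
Q, R = 1e-5, 0.1 ** 2                  # process / measurement noise variances

xhat, P = 0.0, 1.0                     # initial estimate and error variance
for zk in z:
    # time update (predict): constant model, so the estimate carries over
    xhat_minus, P_minus = xhat, P + Q
    # measurement update (correct)
    K = P_minus / (P_minus + R)        # Kalman gain
    xhat = xhat_minus + K * (zk - xhat_minus)
    P = (1 - K) * P_minus

print(xhat)  # converges toward x_true as measurements accumulate
```

With Q this small the filter behaves almost like a running average of the measurements, and the error variance P shrinks roughly as R/n.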



 
Lies, Damn Lies and Inflation
"""
inflation figures have quite different meanings, thanks to different treatment of housing and, to a lesser extent, health care. The most dramatic difference is housing: In the U.S., shelter makes up a third of the consumer-price index, because it includes an imputed rent for homeowners. In Europe only actual rents are measured, at a weight of just 6% of the basket of goods and services underlying the price index.

Measure both using the European approach [...] this measure—known as the harmonized index of consumer prices, or HICP [...] which is still an experimental statistic, but the CPI excluding shelter, food and energy is the best equivalent to core eurozone inflation, and exactly the same at 1.2%.
[...]
The Fed’s preferred inflation gauge, the PCE price index, takes a sixth of its weight from rent and imputed rent. The gap from CPI weights is made up mostly by including employers’ health-care costs to get a health-care weight of a fifth. In Europe the equivalent health-care costs, mostly borne by government, are ignored in HICP, and booze and smokes are almost as important as health-care in determining inflation.
[...]
Central bankers, like all economists, form their views from statistics that often turn out to be wrong or misleading, half-truths about how the economy works and guesses as to how markets will respond.
"""
https://www.wsj.com/articles/lies-damn-lies-and-inflation-1526316237



 
List of nicknames used by Donald Trump
https://en.wikipedia.org/wiki/List_of_nicknames_used_by_Donald_Trump




 
"""
"""
since 2012, the amount of compute used in the largest AI training runs has been increasing exponentially with a 3.5-month doubling time [...] has grown by more than 300,000x [...] The trend represents an increase by roughly a factor of 10 each year.
[...]
Eras

Looking at the graph we can roughly see four distinct eras:

Before 2012: It was uncommon to use GPUs for ML, making any of the results in the graph difficult to achieve.

2012 to 2014: Infrastructure to train on many GPUs was uncommon, so most results used 1-8 GPUs rated at 1-2 TFLOPS for a total of 0.001-0.1 pfs-days.

2014 to 2016: Large-scale results used 10-100 GPUs rated at 5-10 TFLOPS, resulting in 0.1-10 pfs-days. Diminishing returns on data parallelism meant that larger training runs had limited value.

2016 to 2017: Approaches that allow greater algorithmic parallelism such as huge batch sizes, architecture search, and expert iteration, along with specialized hardware such as TPU’s and faster interconnects, have greatly increased these limits, at least for some applications. AlphaGoZero/AlphaZero is the most visible public example of massive algorithmic parallelism,
[...]
The world’s total hardware budget is 1 trillion dollars a year (https://www.statista.com/statistics/422802/hardware-spending-forecast-worldwide/)
[...]
Appendix: Recent novel results that used modest amounts of compute
[...]
Here are some examples of [recent noteworthy] results using modest compute
[...]
Attention is all you need: 0.089 pfs-days (6/2017)
Adam Optimizer: less than 0.0007 pfs-days (12/2014)
Learning to Align and Translate: 0.018 pfs-days (09/2014)
GANs: less than 0.006 pfs-days (6/2014)
Word2Vec: less than 0.00045 pfs-days (10/2013)
Variational Auto Encoders: less than 0.0000055 pfs-days (12/2013)
"""
https://blog.openai.com/ai-and-compute/
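The quoted figures are easy to sanity-check: a 3.5-month doubling time compounds to roughly a factor of 10 per year, and a petaflop/s-day (pfs-day) is 10^15 ops/s sustained for a day. A small sketch (the 8-GPU, 2-TFLOPS, 5-day run is an illustrative peak-FLOPS assumption matching the 2012-2014 era described above, ignoring utilization):

```python
import math

# 3.5-month doubling time compounds to roughly 10x per year
yearly_factor = 2 ** (12 / 3.5)            # ≈ 10.8

# at that rate, 300,000x growth takes about five and a half years
years_for_300k = math.log(300_000) / math.log(yearly_factor)   # ≈ 5.3

# one pfs-day = 1e15 FLOP/s sustained for 24 hours, in total FLOPs
PFS_DAY = 1e15 * 24 * 3600                 # ≈ 8.64e19 FLOPs

# e.g. 8 GPUs at 2 TFLOPS peak each, training for 5 days
flops = 8 * 2e12 * 5 * 24 * 3600
print(flops / PFS_DAY)                     # ≈ 0.08 pfs-days
```

The ~5.3-year figure is consistent with the 2012-to-2018 window of the post, and 0.08 pfs-days lands inside the 0.001-0.1 range quoted for the 2012-2014 era.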



