Oleg Zabluda's blog
Friday, November 11, 2016
 
"""
"""
Brian Kenji Iwana and Seiichi Uchida at Kyushu University in Japan [...] trained a deep neural network to study book covers and determine the category of book they come from. Their method is straightforward. Iwana and Uchida downloaded 137,788 unique book covers from Amazon.com along with the genre of book. There are 20 possible genres but where a book was listed in more than one category, the researchers used just the first.
"""
https://www.technologyreview.com/s/602807/deep-neural-network-learns-to-judge-books-by-their-covers/
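For concreteness, a minimal sketch of the kind of classifier described: a small convolutional network that maps a cover image to one of the 20 genre labels, trained with cross-entropy against the first-listed genre. The network name, layer sizes, and 224x224 input below are illustrative assumptions, not the architecture Iwana and Uchida actually used.

import torch
import torch.nn as nn

NUM_GENRES = 20

class CoverGenreNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # global average pool
        )
        self.classifier = nn.Linear(128, NUM_GENRES)

    def forward(self, x):                     # x: (batch, 3, H, W) cover images
        return self.classifier(self.features(x).flatten(1))   # genre logits

model = CoverGenreNet()
covers = torch.randn(4, 3, 224, 224)          # dummy batch of 4 covers
labels = torch.randint(0, NUM_GENRES, (4,))   # first-listed genre per book
loss = nn.CrossEntropyLoss()(model(covers), labels)
loss.backward()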



 
"""
"""
To better understand and combat [cancer], medical researchers rely on cancer registry programs—a national network of organizations that systematically collect demographic and clinical information related to the diagnosis, treatment, and history of cancer incidence in the United States. [...] Much of this data is drawn from electronic, text-based clinical reports that must be manually curated—a time-intensive process—before it can be used in research. For example, cancer pathology reports, text documents that describe cancerous tissue in detail, must be individually read and annotated by experts before becoming part of a cancer registry. With millions of new reports being produced each year [...] “The manual model is not scalable,”
[...]
Since 2014 Tourassi has led a team focused on creating software that can quickly identify valuable information in cancer reports, [...] Using a dataset composed of 1,976 pathology reports provided [...] Tourassi’s team trained a deep-learning algorithm to carry out two different but closely related information-extraction tasks. In the first task the algorithm scanned each report to identify the primary location of the cancer. In the second task the algorithm identified the cancer site’s laterality—or on which side of the body the cancer was located.

By setting up a neural network designed to exploit the related information shared by the two tasks, an arrangement known as multitask learning, the team found the algorithm performed substantially better than competing methods.
"""
https://www.olcf.ornl.gov/2016/11/08/accelerating-cancer-research-with-deep-learning/
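A hedged sketch of the multitask arrangement described: one shared text encoder feeding two classification heads, one for primary cancer site and one for laterality, trained with a joint loss so the two related tasks can share what the encoder learns. Vocabulary size, hidden sizes, and the label counts (NUM_SITES, NUM_LATERALITIES) are placeholder assumptions; the article does not give the team's actual architecture.

import torch
import torch.nn as nn

VOCAB_SIZE, EMBED_DIM, HIDDEN = 20000, 128, 256
NUM_SITES, NUM_LATERALITIES = 12, 3      # assumed label-space sizes

class MultitaskReportClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, EMBED_DIM, padding_idx=0)
        self.encoder = nn.GRU(EMBED_DIM, HIDDEN, batch_first=True)
        self.site_head = nn.Linear(HIDDEN, NUM_SITES)                # task 1: primary site
        self.laterality_head = nn.Linear(HIDDEN, NUM_LATERALITIES)   # task 2: laterality

    def forward(self, tokens):              # tokens: (batch, seq_len) word ids
        _, h = self.encoder(self.embed(tokens))
        h = h.squeeze(0)                    # shared report representation
        return self.site_head(h), self.laterality_head(h)

model = MultitaskReportClassifier()
tokens = torch.randint(1, VOCAB_SIZE, (8, 300))       # 8 dummy tokenized reports
site_logits, lat_logits = model(tokens)
# One joint loss over both tasks -- the core of the multitask setup.
loss = (nn.CrossEntropyLoss()(site_logits, torch.randint(0, NUM_SITES, (8,)))
        + nn.CrossEntropyLoss()(lat_logits, torch.randint(0, NUM_LATERALITIES, (8,))))
loss.backward()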



 
Oxford and Cambridge are losing AI researchers to DeepMind
"""
Demis Hassabis, Mustafa Suleyman, and Shane Legg have all studied or worked at either Oxford or Cambridge at some point [...] Oxford and Cambridge appear to be the most popular hiring grounds.

Matthew Grimes, for example, was a Cambridge postdoctoral research associate focusing on applying deep learning to vision from October 2013 to December 2015. He joined DeepMind in January 2016 as a research scientist.

Then there's Yutian Chen, who was a research associate at Cambridge from August 2013 to August 2015. [...] Andrew Ballard, meanwhile, was a postdoctoral researcher at Cambridge from 2012 to 2015. He left to join DeepMind in October 2015 as a researcher.

Elsewhere, Irina Higgins left her role as a machine learning tutor at Oxford in April 2015 before joining DeepMind as a research scientist in June of the same year. Higgins also completed a PhD in computational neuroscience and artificial intelligence at Oxford before leaving to join DeepMind.
[...]
Edward Grefenstette, Nando de Freitas, and Karl Hermann — the cofounders of Oxford spinout Dark Blue Labs — have all left their research positions at Oxford and joined DeepMind, which makes sense, given their startup was acquired by DeepMind.

DeepMind has also hired Martin Szummer, who was once a senior researcher at Cambridge. He left the university to focus on his startup, VocalIQ, which was acquired by Apple in 2014.
"""
http://www.businessinsider.com/oxbridge-ai-researchers-to-deepmind-2016-11



 
LipNet: Sentence-level Lipreading (2016) Yannis M. Assael et al
"""
More recent deep lipreading approaches are end-to-end trainable (Wand et al., 2016; Chung & Zisserman, 2016a). All existing works, however, perform only word classification, not sentence-level sequence prediction. [...] LipNet, a model that maps a variable-length sequence of video frames to text, making use of spatiotemporal convolutions, an LSTM recurrent network, and the connectionist temporal classification loss, trained entirely end-to-end. To the best of our knowledge, LipNet is the first lipreading model to operate at sentence-level, using a single end-to-end speaker-independent deep model to simultaneously learn spatiotemporal visual features and a sequence model. On the GRID corpus, LipNet achieves 93.4% accuracy, outperforming experienced human lipreaders and the previous 79.6% state-of-the-art accuracy.
"""
https://arxiv.org/abs/1611.01599
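A skeletal sketch of the ingredients the abstract names: spatiotemporal (3D) convolutions over the frame sequence, a recurrent network over time, and the connectionist temporal classification (CTC) loss. Layer sizes, pooling, and the character set (NUM_CHARS) are assumptions for illustration, not LipNet's actual configuration.

import torch
import torch.nn as nn

NUM_CHARS = 28    # assumed: 26 letters + space + CTC blank (index 0)

class LipReader(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv3d = nn.Sequential(                    # spatiotemporal convolution
            nn.Conv3d(3, 32, kernel_size=(3, 5, 5), padding=(1, 2, 2)),
            nn.ReLU(),
            nn.MaxPool3d(kernel_size=(1, 2, 2)),        # pool space, keep time steps
        )
        self.rnn = nn.LSTM(input_size=32, hidden_size=128,
                           bidirectional=True, batch_first=True)
        self.fc = nn.Linear(256, NUM_CHARS)

    def forward(self, frames):               # frames: (batch, 3, T, H, W)
        x = self.conv3d(frames)              # (batch, 32, T, H', W')
        x = x.mean(dim=(3, 4))               # collapse space -> (batch, 32, T)
        x, _ = self.rnn(x.transpose(1, 2))   # (batch, T, 256)
        return self.fc(x).log_softmax(-1)    # per-frame character log-probs

model = LipReader()
frames = torch.randn(2, 3, 40, 50, 100)               # 2 dummy clips of 40 frames
log_probs = model(frames).transpose(0, 1)             # (T, batch, chars) for CTC
targets = torch.randint(1, NUM_CHARS, (2, 20))        # dummy character targets
loss = nn.CTCLoss(blank=0)(
    log_probs, targets,
    input_lengths=torch.full((2,), 40, dtype=torch.long),
    target_lengths=torch.full((2,), 20, dtype=torch.long))
loss.backward()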

https://www.youtube.com/watch?v=fa5QGremQf8



 
"""
"""
Whirlpool and other companies are reshaping their factory floors around “collaborative robots” that can stop if a person bumps into them. That precaution allows them to operate in tight spaces with little or no protective boundary. [...] they make U.S. factories more competitive versus countries with cheaper wages. [...] North American manufacturers installed more than 28,000 robots last year [...] Collaborative robots likely make up just a fraction of the total [...] Fanuc America has sold fewer than 100 of the devices [...] They aren’t as strong as bigger robots, and because they move more slowly to keep their human neighbors safe, they aren’t always as efficient.
[...]
Universal Robots AS of Denmark sells one-arm robots for up to $45,000. This type of robot can work around the clock, taking the place of workers on three shifts. The average production worker makes $36,220 a year, not including overtime, health and other benefits, according to the Labor Department. Manufacturing executives also say the robots save on materials costs because they apply materials like glue more efficiently.
"""
http://www.wsj.com/articles/latest-robots-lend-a-helping-arm-at-factories-1478624535
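The back-of-the-envelope payback implied by the quoted figures, counting list price and base wages only (ignoring benefits, overtime, maintenance, and any throughput differences):

robot_cost = 45_000          # upper end of Universal Robots' quoted price
worker_wage = 36_220         # average production worker, per year (Labor Dept.)
shifts_replaced = 3          # a robot can run around the clock

annual_labor_saved = shifts_replaced * worker_wage     # $108,660
payback_years = robot_cost / annual_labor_saved        # about 0.41 years
print(f"Payback: {payback_years:.2f} years (~{payback_years * 12:.0f} months)")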



