Transfer Learning Using Feature Selection - Computer Science > Learning


Abstract: We present three related ways of using transfer learning to improve feature selection. The three methods address different problems, and hence share different kinds of information between tasks or feature classes, but all three are based on the information-theoretic Minimum Description Length (MDL) principle and share the same underlying Bayesian interpretation. The first method, MIC, applies when predictive models are to be built simultaneously for multiple tasks ("simultaneous transfer") that share the same set of features. MIC allows each feature to be added to none, some, or all of the task models, and is most beneficial for selecting a small set of predictive features from a large pool of features, as is common in genomic and biological datasets. Our second method, TPC (Three Part Coding), uses a similar methodology for the case when the features can be divided into feature classes. Our third method, Transfer-TPC, addresses the "sequential transfer" problem, in which the task to which we want to transfer knowledge may not be known in advance and may have different amounts of data than the other tasks. Transfer-TPC is most beneficial when we want to transfer knowledge between tasks that have unequal amounts of labeled data, for example the data for disambiguating the senses of different verbs. We demonstrate the effectiveness of these approaches with experimental results on real-world data pertaining to genomics and to Word Sense Disambiguation (WSD).
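To make the MDL framing concrete, here is a minimal sketch of generic MDL-penalized forward feature selection, the kind of procedure the paper's coding schemes refine. This is an illustration only, not the paper's MIC or TPC methods: the cost function below (a simple two-part code with a Gaussian residual term and roughly log2(p) bits per selected feature) is an assumed stand-in for the more elaborate codes the authors design.

```python
# Sketch of MDL-style forward feature selection (illustrative; NOT the
# paper's exact MIC/TPC coding schemes). Greedily add the feature that
# most reduces total description length = data bits + model bits.
import numpy as np

def mdl_cost(X, y, subset):
    """Two-part code: bits to encode residuals plus bits to name features."""
    n, p = X.shape
    if subset:
        Xs = X[:, subset]
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        rss = float(np.sum((y - Xs @ beta) ** 2))
    else:
        rss = float(np.sum(y ** 2))
    data_bits = 0.5 * n * np.log2(max(rss / n, 1e-12))  # Gaussian residual code
    model_bits = len(subset) * np.log2(p)               # ~log2(p) bits per chosen feature
    return data_bits + model_bits

def select_features(X, y):
    """Add features one at a time while total description length decreases."""
    chosen, best = [], mdl_cost(X, y, [])
    improved = True
    while improved:
        improved = False
        for j in range(X.shape[1]):
            if j in chosen:
                continue
            cost = mdl_cost(X, y, chosen + [j])
            if cost < best:
                best, best_j, improved = cost, j, True
        if improved:
            chosen.append(best_j)
    return chosen
```

On synthetic data where only a couple of columns carry signal, the log2(p)-bit penalty per feature plays the role the paper assigns to coding costs: it lets a small predictive subset in while keeping most noise features out, which is the regime (few relevant features, large pool) the abstract highlights for genomic data.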

Authors: Paramveer S. Dhillon, Dean Foster, Lyle Ungar

