MISC

April 15, 2013

Numerosity Reduction for Resource Constrained Learning

情報処理学会論文誌 (IPSJ Journal)
  • Khamisi Kalegele
  • Hideyuki Takahashi
  • Johan Sveholm
  • Kazuto Sasai
  • Gen Kitagata
  • Tetsuo Kinoshita

Volume 54, Issue 4
Language
English
Publication type
When coupling data mining (DM) and learning agents, one of the crucial challenges is the need for the Knowledge Extraction (KE) process to be lightweight enough that even resource-constrained (e.g., memory, CPU) agents are able to extract knowledge. We propose the Stratified Ordered Selection (SOS) method for achieving lightweight KE using dynamic numerosity reduction of training examples. SOS allows agents to retrieve different-sized training subsets based on available resources. The method employs ranking-based subset selection using a novel Level Order (LO) ranking scheme. We show the representativeness of subsets selected using the proposed method, its noise tolerance, and its ability to preserve KE performance over different reduction levels. When compared to subset selection methods of the same category, the proposed method offers the best trade-off between cost, reduction, and the ability to preserve performance.

------------------------------
This is a preprint of an article intended for publication in the Journal of Information Processing (JIP). This preprint should not be cited. This article should be cited as: Journal of Information Processing Vol.21 (2013) No.2 (online). DOI: http://dx.doi.org/10.2197/ipsjjip.21.329
------------------------------
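The abstract describes ranking-based, stratified subset selection in which an agent can request a subset of any size k that remains representative of the full training set. The sketch below illustrates that general idea only; the per-stratum ranking used here (distance to the class centroid) and the function name are stand-ins of my own, not the paper's Level Order (LO) ranking scheme.

```python
# Illustrative sketch of stratified, ranking-based numerosity reduction.
# NOTE: the ranking heuristic (closeness to the class centroid) is an
# assumption for illustration; it is NOT the paper's LO ranking scheme.
from collections import defaultdict

def stratified_ordered_subset(examples, labels, k):
    """Return k training examples, drawn round-robin from per-class
    rankings so that any size-k prefix stays class-balanced."""
    strata = defaultdict(list)
    for x, y in zip(examples, labels):
        strata[y].append(x)

    # Rank each stratum; here by squared distance to the stratum mean
    # (stand-in ranking, hypothetical).
    ranked = {}
    for y, xs in strata.items():
        dim = len(xs[0])
        centroid = [sum(x[i] for x in xs) / len(xs) for i in range(dim)]
        ranked[y] = sorted(
            xs, key=lambda x: sum((a - b) ** 2 for a, b in zip(x, centroid))
        )

    # Interleave the strata so a budget of k examples covers all classes
    # as evenly as possible.
    subset, i = [], 0
    while len(subset) < k:
        progressed = False
        for y in ranked:
            if i < len(ranked[y]):
                subset.append((ranked[y][i], y))
                progressed = True
                if len(subset) == k:
                    break
        if not progressed:
            break  # all strata exhausted
        i += 1
    return subset
```

The round-robin interleaving is one simple way to realize the property the abstract claims, namely that agents with different resource budgets can each retrieve a subset sized to what they can afford while preserving class coverage.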

Links
CiNii Articles
http://ci.nii.ac.jp/naid/110009579578
CiNii Books
http://ci.nii.ac.jp/ncid/AN00116647
URL
http://id.nii.ac.jp/1001/00091597/
ID information
  • ISSN : 1882-7764
  • CiNii Articles ID : 110009579578
  • CiNii Books ID : AN00116647
