An adaptive method of numerical attribute merging for quantitative association rule mining

Jiuyong Li, Hong Shen, Rodney Topor

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

7 Citations (Scopus)


Mining quantitative association rules is an important topic in data mining, since most real-world databases have both numerical and categorical attributes. Typical solutions partition each numerical attribute into a set of disjoint intervals, interpret each interval as an item, and apply standard boolean association rule mining. Commonly used partitioning methods construct a set of intervals that have either equal width or equal cardinality. We introduce an adaptive partitioning method based on repeatedly merging smaller intervals into larger ones. This method provides an effective compromise between the equal-width and equal-cardinality criteria. Experimental results show that the proposed method is effective and improves on both equal-width partitioning and equal-cardinality partitioning.
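The abstract describes the general idea of merge-based discretization without giving the paper's exact procedure. As a minimal illustrative sketch (not the authors' algorithm), one way to realize the idea is to start from many fine equal-width intervals and greedily merge the adjacent pair with the smallest combined count until every interval holds a minimum number of values; the `n_initial` and `min_count` parameters here are hypothetical knobs introduced only for this illustration:

```python
def adaptive_merge(values, n_initial=20, min_count=None):
    """Sketch of merge-based discretization: begin with fine
    equal-width intervals, then repeatedly merge the adjacent
    pair with the smallest combined count until every interval
    holds at least min_count values -- a compromise between
    equal-width and equal-cardinality partitioning."""
    values = sorted(values)
    lo, hi = values[0], values[-1]
    if min_count is None:
        min_count = len(values) // 5  # illustrative default, not from the paper
    width = (hi - lo) / n_initial or 1.0  # guard against a degenerate range

    # Count how many values fall into each initial equal-width bin.
    counts = [0] * n_initial
    for v in values:
        i = min(int((v - lo) / width), n_initial - 1)
        counts[i] += 1

    # Interval i spans [edges[i], edges[i+1]].
    edges = [lo + k * width for k in range(n_initial)] + [hi]

    # Greedily merge the adjacent pair with the smallest combined
    # count until every interval meets the cardinality floor.
    while len(counts) > 1 and min(counts) < min_count:
        i = min(range(len(counts) - 1),
                key=lambda k: counts[k] + counts[k + 1])
        counts[i] += counts.pop(i + 1)
        edges.pop(i + 1)

    return list(zip(edges[:-1], edges[1:], counts))
```

On uniform data this behaves like equal-width partitioning, while on skewed data sparse bins are absorbed into their neighbours, pulling the result toward equal cardinality.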

Original language: English
Title of host publication: Internet Applications - 5th International Computer Science Conference ICSC 1999, Proceedings
Editors: Lucas Chi-Kwong Hui, Dik Lun Lee
Publisher: Springer Verlag
Number of pages: 12
ISBN (Print): 3540669035, 9783540669036
Publication status: Published - 1999
Externally published: Yes
Event: 5th International Computer Science Conference, ICSC 1999 - Hong Kong, China
Duration: 13 Dec 1999 - 15 Dec 1999

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 5th International Computer Science Conference, ICSC 1999
City: Hong Kong


  • Association rule
  • Continuous attribute discretization
  • Data mining


