Metric anomaly detection via asymmetric risk minimization (Conference Paper)

abstract

  • We propose what appears to be the first anomaly detection framework that learns from positive examples only and is sensitive to substantial differences in the presentation and penalization of normal vs. anomalous points. Our framework introduces a novel type of asymmetry between how false alarms (misclassifications of a normal instance as an anomaly) and missed anomalies (misclassifications of an anomaly as normal) are penalized: whereas each false alarm incurs a unit cost, our model assumes that a high global cost is incurred if one or more anomalies are missed. We define a few natural notions of risk along with efficient minimization algorithms. Our framework is applicable to any metric space with a finite doubling dimension. We make minimalistic assumptions that naturally generalize notions such as margin in Euclidean spaces. We provide a theoretical analysis of the risk and show that under mild conditions, our classifier is asymptotically consistent. The learning algorithms we propose are computationally and statistically efficient and admit a further tradeoff between running time and precision. Some experimental results on real-world data are provided.
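
  • Illustrative sketch (not part of the paper's record): the abstract describes an asymmetric cost model in which each false alarm incurs a unit cost while a single large global penalty is charged if one or more anomalies are missed, learned from normal (positive) examples only over a metric space. The toy Python code below illustrates that cost model together with a simple radius-based one-class rule on Euclidean data; the names (asymmetric_risk, radius_classifier, miss_cost) and the radius rule itself are assumptions for illustration, not the authors' algorithm or guarantees.

        import numpy as np

        def asymmetric_risk(y_true, y_pred, miss_cost=100.0):
            """Toy empirical version of the asymmetric cost model sketched in the
            abstract: each false alarm (normal point flagged as anomalous) costs 1,
            and a single large global cost is charged if at least one true anomaly
            is missed. `miss_cost` is a hypothetical parameter, not from the paper."""
            y_true = np.asarray(y_true, dtype=bool)   # True = anomaly
            y_pred = np.asarray(y_pred, dtype=bool)   # True = flagged as anomaly
            false_alarms = np.sum(~y_true & y_pred)   # unit cost per false alarm
            any_missed = np.any(y_true & ~y_pred)     # global penalty, charged once
            return float(false_alarms) + (miss_cost if any_missed else 0.0)

        def radius_classifier(train_normals, radius):
            """Positive-only (one-class) rule, sketched here for Euclidean data as a
            stand-in for a general metric space: a test point is declared normal iff
            it lies within `radius` of some training (normal) point."""
            train_normals = np.asarray(train_normals, dtype=float)
            def predict(x):
                d = np.linalg.norm(train_normals - np.asarray(x, dtype=float), axis=1)
                return bool(d.min() > radius)         # True = anomaly
            return predict

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            normals = rng.normal(0.0, 1.0, size=(200, 2))   # training set: normals only
            test_normals = rng.normal(0.0, 1.0, size=(50, 2))
            test_anoms = rng.normal(6.0, 0.5, size=(5, 2))
            clf = radius_classifier(normals, radius=1.0)
            X = np.vstack([test_normals, test_anoms])
            y = np.array([False] * 50 + [True] * 5)
            preds = np.array([clf(x) for x in X])
            print("empirical asymmetric risk:", asymmetric_risk(y, preds))

    The point of the sketch is only to make the cost asymmetry concrete: shrinking the radius flags more points, driving the missed-anomaly penalty toward zero at the price of more unit-cost false alarms, which is the tradeoff the paper's risk notions and minimization algorithms formalize.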

publication date

  • September 28, 2011