Document Type
Pre-Print
Publication Date
3-2017
Subjects
Mathematical optimization, Hierarchical clustering (Cluster analysis), Convex functions, Smoothing (Numerical analysis)
Abstract
A bilevel hierarchical clustering model is commonly used in designing optimal multicast networks. In this paper we consider two different formulations of the bilevel hierarchical clustering problem, a discrete optimization problem which can be shown to be NP-hard. Our approach is to reformulate the problem as a continuous optimization problem by relaxing the discreteness conditions. This approach was considered by other researchers earlier, but their proposed methods depend on the squared Euclidean norm because of its differentiability. By applying the Nesterov smoothing technique and the DCA -- a numerical algorithm for minimizing differences of convex functions -- we are able to cope with new formulations that involve the Euclidean norm instead of the squared Euclidean norm. Numerical examples are provided to illustrate our method.
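The two ingredients named in the abstract can be illustrated on a toy problem. The sketch below is not the paper's clustering algorithm: the functions f, g, and h are hypothetical one-dimensional examples chosen only to show (1) Nesterov smoothing of the Euclidean norm, which in one dimension reduces to the Huber function, and (2) one DCA iteration scheme for a difference of convex functions.

```python
import math

# Nesterov smoothing of |x| in one dimension yields the Huber function:
# x^2 / (2*mu) near zero, and |x| - mu/2 away from zero, where mu > 0
# is the smoothing parameter.  (The paper works with ||x|| in R^n.)
def huber(x, mu):
    if abs(x) <= mu:
        return x * x / (2.0 * mu)
    return abs(x) - mu / 2.0

# Toy DCA on f(x) = x^4 - x^2, written as g(x) - h(x) with
# g(x) = x^4 and h(x) = x^2, both convex.  Each DCA step takes
# y_k in the subdifferential of h at x_k, then minimizes the convex
# function g(x) - y_k * x; here that subproblem solves 4 x^3 = y_k
# in closed form.
def dca(x0, iters=50):
    x = x0
    for _ in range(iters):
        y = 2.0 * x  # gradient of h at x
        # argmin of g(x) - y*x: solve 4 x^3 = y, keeping the sign of y
        x = math.copysign(abs(y / 4.0) ** (1.0 / 3.0), y)
    return x

x_star = dca(1.0)  # approaches 1/sqrt(2), a critical point of f
```

Starting from x0 = 1, the iteration x_{k+1} = (x_k / 2)^{1/3} contracts toward 1/sqrt(2), where f'(x) = 4x^3 - 2x vanishes; this mirrors, in miniature, how DCA reduces a nonconvex DC objective to a sequence of convex subproblems.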
Locate the Document
This article was originally published on arXiv.org.
Persistent Identifier
http://archives.pdx.edu/ds/psu/19539
Citation Details
Nam, N. M., Geremew, W., Raynolds, S., & Tran, T. (2017). The Nesterov Smoothing Technique and Minimizing Differences of Convex Functions for Hierarchical Clustering. arXiv preprint arXiv:1701.04464.