

Note: This package has been maintained by @terrytangyuan since 2015. Please consider sponsoring!

[Badges: JOSS DOI · CRAN status · coverage status · downloads from the RStudio CRAN mirror · license · Zenodo DOI]

dml (Distance Metric Learning in R)

R package for a collection of Distance Metric Learning algorithms, including global and local methods such as Relevant Component Analysis, Discriminative Component Analysis, Local Fisher Discriminant Analysis, etc. These distance metric learning methods are widely applied in feature extraction, dimensionality reduction, clustering, classification, information retrieval, and computer vision problems.

Installation

Install the current release from CRAN:

install.packages("dml")

Or, try the latest development version from GitHub:

devtools::install_github("terrytangyuan/dml")

Examples

Relevant Component Analysis

library("MASS")

# generate synthetic multivariate normal data
set.seed(42)

k <- 100L # sample size of each class
n <- 3L # specify how many classes
N <- k * n # total sample size

x1 <- mvrnorm(k, mu = c(-16, 8), matrix(c(15, 1, 2, 10), ncol = 2))
x2 <- mvrnorm(k, mu = c(0, 0), matrix(c(15, 1, 2, 10), ncol = 2))
x3 <- mvrnorm(k, mu = c(16, -8), matrix(c(15, 1, 2, 10), ncol = 2))
x <- as.data.frame(rbind(x1, x2, x3)) # predictors
y <- gl(n, k) # response

# fully labeled data set with 3 classes
# need to use a line in 2D to classify
plot(x[, 1L], x[, 2L],
  bg = c("#E41A1C", "#377EB8", "#4DAF4A")[y],
  pch = rep(c(22, 21, 25), each = k)
)
abline(a = -10, b = 1, lty = 2)
abline(a = 12, b = 1, lty = 2)

# generate synthetic chunklets
chunks <- vector("list", 300)
for (i in 1:100) chunks[[i]] <- sample(1L:100L, 10L)
for (i in 101:200) chunks[[i]] <- sample(101L:200L, 10L)
for (i in 201:300) chunks[[i]] <- sample(201L:300L, 10L)

chks <- x[unlist(chunks), ]

# make "chunklet" vector to feed the chunks argument
chunksvec <- rep(-1L, nrow(x))
for (i in 1L:length(chunks)) {
  for (j in 1L:length(chunks[[i]])) {
    chunksvec[chunks[[i]][j]] <- i
  }
}

# relevant component analysis
rcs <- rca(x, chunksvec)

# learned transformation of the data
rcs$A
#>           [,1]       [,2]
#> [1,] -3.181484 -0.8812647
#> [2,] -1.196200  2.3438640

# learned Mahalanobis distance metric
rcs$B
#>           [,1]     [,2]
#> [1,] 10.898467 1.740125
#> [2,]  1.740125 6.924592

# whitening transformation applied to the chunklets
chkTransformed <- as.matrix(chks) %*% rcs$A

# original data after applying RCA transformation
# easier to classify - using only horizontal lines
xnew <- rcs$newX
plot(xnew[, 1L], xnew[, 2L],
  bg = c("#E41A1C", "#377EB8", "#4DAF4A")[gl(n, k)],
  pch = c(rep(22, k), rep(21, k), rep(25, k))
)
abline(a = -15, b = 0, lty = 2)
abline(a = 16, b = 0, lty = 2)
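As a small follow-up (not part of the original example), the learned metric can also be used to compute distances between individual observations. This is only a sketch: it assumes rcs$B is covariance-like, so the distance uses its inverse; check ?rca for the package's exact convention.

# hedged sketch: distance under the learned metric, assuming rcs$B is
# covariance-like (so the distance uses solve(rcs$B)); see ?rca to confirm
dist_learned <- function(u, v, B) {
  d <- unlist(u) - unlist(v)
  sqrt(drop(t(d) %*% solve(B) %*% d))
}

# one point from class 1 vs. one point from class 2,
# under the learned metric and under the plain Euclidean metric
dist_learned(x[1, ], x[k + 1, ], rcs$B)
sqrt(sum((unlist(x[1, ]) - unlist(x[k + 1, ]))^2))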

Other Examples

For examples of Local Fisher Discriminant Analysis, see the separate lfda package (a minimal sketch follows below). For examples of all other implemented algorithms, see the dml package reference manual.
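The snippet below is only a sketch: it assumes the lfda package's lfda(x, y, r, metric) interface and that the returned object exposes the transformed data as Z; consult that package's documentation for the authoritative usage.

# minimal sketch, assuming lfda::lfda(x, y, r, metric) and a Z component
# holding the transformed data; see the lfda documentation to confirm
library("lfda")

model <- lfda(iris[, -5], iris[, 5], r = 2, metric = "plain")
head(model$Z) # data projected onto the learned 2-dimensional space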

Brief Introduction

Distance metrics are widely used in the machine learning literature. Traditionally, a metric is chosen a priori (Euclidean distance, L1 distance, etc.) or by cross-validation within a small class of functions (e.g., choosing the order of a polynomial kernel). With prior knowledge of the data, however, a more suitable distance metric can be learned using (semi-)supervised distance metric learning techniques. dml is an R package that aims to implement a collection of algorithms for (semi-)supervised distance metric learning. These methods are widely applied in feature extraction, dimensionality reduction, clustering, classification, information retrieval, and computer vision problems.

Algorithms

Algorithms planned in the first development stage:

  • Supervised Global Distance Metric Learning:

    • Relevant Component Analysis (RCA) - implemented
    • Kernel Relevant Component Analysis (KRCA)
    • Discriminative Component Analysis (DCA) - implemented
    • Kernel Discriminative Component Analysis (KDCA)
    • Global Distance Metric Learning by Convex Programming - implemented (a usage sketch follows after this list)
  • Supervised Local Distance Metric Learning:

    • Local Fisher Discriminant Analysis - implemented
    • Kernel Local Fisher Discriminant Analysis - implemented
    • Information-Theoretic Metric Learning (ITML)
    • Large Margin Nearest Neighbor Classifier (LMNN)
    • Neighbourhood Components Analysis (NCA)
    • Localized Distance Metric Learning (LDM)

The algorithms and routines may be adjusted during development.
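For the convex-programming method, the sketch below assumes the GdmFull(data, simi, dism, maxiter) interface, with simi and dism given as two-column matrices of row indices describing similar and dissimilar pairs; consult ?GdmFull before relying on these conventions.

# hedged sketch, assuming GdmFull(data, simi, dism, maxiter) with simi/dism
# as two-column matrices of row indices (similar / dissimilar pairs)
library("dml")
library("MASS")

set.seed(123)
a <- mvrnorm(50, mu = c(0, 0, 0), Sigma = diag(3))
b <- mvrnorm(50, mu = c(5, 5, 5), Sigma = diag(3))
data <- rbind(a, b)

# pairs within the same class are "similar"; pairs across classes are "dissimilar"
simi <- rbind(t(combn(1:5, 2)), t(combn(51:55, 2)))
dism <- as.matrix(expand.grid(1:5, 51:55))

result <- GdmFull(data, simi, dism, maxiter = 100)
str(result) # learned full Mahalanobis matrix and related output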

Contribute & Code of Conduct

To contribute to this project, please take a look at the Contributing Guidelines first. Please note that this project is released with a Contributor Code of Conduct. By contributing to this project, you agree to abide by its terms.

Contact

Contact the maintainer of this package: Yuan Tang [email protected]


dml's Issues

MIT license?

I notice on CRAN that this package is listed as MIT-licensed, but this repo doesn't contain the MIT license text. Are you open to a PR that adds the MIT license file?

GdmFull

GdmFull(data, simi, dism, maxiter = 100)

'data' is a 1728x6 matrix.

With this data, running GdmFull(data, simi, dism, maxiter = 100) produces the error below (it occurs whenever the data has a number of columns other than 3):

Error in vl[[2]] %*% diag(vl[[1]], d) : non-conformable arguments

Please let me know how to work around this.

Thank you in advance!

useD can't be set in dca

When I set the "useD" parameter in the dca function, for example useD = 10 (the original dimension is 100), the following error occurs:

Error in eigen(as.matrix(x)) : infinite or missing values in 'x'

The useD argument appears to have no effect, and I don't know why.

Looks like GdmFull only works for 3-dimensional datasets

Error in vl[[2]] %*% diag(vl[[1]], d) : non-conformable arguments
In addition: Warning message:
In matrix(x, 3, 3) :
data length [16] is not a sub-multiple or multiple of the number of rows [3]

It looks like GdmFull only works for 3-dimensional datasets.

GdmFull not working with 178x2 data matrix - 'dml' version 1.1.0

GdmFull(data, simi, dism, maxiter = 100)

'data' is a 178x2 matrix; 'simi' and 'dism' are 178x2 matrices describing similar and dissimilar constraints from the 'data' matrix.

With this data, running GdmFull(data, simi, dism, maxiter = 100) produces the following error:

Error in while (projection.iters < maxiter & satisfy == 0) { :
missing value where TRUE/FALSE needed

Please let me know how to work around this.

Thank you in advance!
