CS231n assignment1 SVM


from cs231n.classifiers.linear_svm import svm_loss_naive
The linear SVM classifier consists of two parts:
1. a score function that maps the raw data to class scores, i.e., the function f(W, x)
2. a loss function that quantifies the agreement between the predicted scores and the ground truth labels
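For a linear classifier the score function is just a matrix multiply. With the (D, C) weight layout used in the code below, each example $x_i$ (a row vector of length D) gets one score per class:

$$ s = f(x_i; W) = x_i W \in \mathbb{R}^{1 \times C} $$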

margin:

The SVM loss function wants the score of the correct class $y_i$ to be larger than the incorrect class scores by at least $\Delta$ (delta). If this is not the case, we accumulate loss.
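Written out, the multiclass SVM loss for example $i$ is, with $s_j$ the score of class $j$:

$$ L_i = \sum_{j \neq y_i} \max\left(0,\; s_j - s_{y_i} + \Delta\right) $$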

Example:

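A worked instance (the numbers used in the course notes): suppose there are three classes with scores $s = [13, -7, 11]$, the correct class is the first one ($y_i = 0$), and $\Delta = 10$. Then

$$ L_i = \max(0, -7 - 13 + 10) + \max(0, 11 - 13 + 10) = 0 + 8 = 8 $$

The first term is zero because the correct class already beats the second class by more than the margin; only the third class violates the margin and contributes to the loss.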

1. loss function

In cs231n/classifiers/linear_svm.py:

svm_loss_naive

The SVM wants the score of the correct class to be higher than each incorrect class score by a fixed margin $\Delta$.
How the SVM loss is computed:

    Inputs:
    - W: A numpy array of shape (D, C) containing weights.
    - X: A numpy array of shape (N, D) containing a minibatch of data.
    - y: A numpy array of shape (N,) containing training labels; y[i] = c means
      that X[i] has label c, where 0 <= c < C.
    - reg: (float) regularization strength


    def svm_loss_naive(W, X, y, reg):
        dW = np.zeros(W.shape)        # gradient accumulator, shape (D, C)
        num_classes = W.shape[1]
        num_train = X.shape[0]
        loss = 0.0
        for i in range(num_train):               # 0 .. N-1
            scores = X[i].dot(W)                 # class scores for one example, shape (C,)
            correct_class_score = scores[y[i]]
            for j in range(num_classes):         # 0 .. C-1
                if j == y[i]:
                    continue
                margin = scores[j] - correct_class_score + 1  # note delta = 1
                if margin > 0:
                    loss += margin
                    dW[:, j] += X[i]             # violated margin: offending class gets +x_i
                    dW[:, y[i]] -= X[i]          # correct class gets -x_i per violation
        # average over the batch and add L2 regularization (matches the vectorized version below)
        loss /= num_train
        loss += 0.5 * reg * np.sum(W * W)
        dW /= num_train
        dW += reg * W
        return loss, dW
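The two dW updates inside the margin > 0 branch implement the analytic gradient of the hinge loss. With $\mathbb{1}(\cdot)$ the indicator function:

$$ \nabla_{w_j} L_i = \mathbb{1}\left(s_j - s_{y_i} + \Delta > 0\right) x_i \quad (j \neq y_i), \qquad \nabla_{w_{y_i}} L_i = -\Big(\sum_{j \neq y_i} \mathbb{1}\left(s_j - s_{y_i} + \Delta > 0\right)\Big) x_i $$

Each violated margin adds $x_i$ to the weight column of the offending class and subtracts $x_i$ from the column of the correct class.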

svm_loss_vectorized

loss

    num_train = X.shape[0]
    scores = X.dot(W)                             # all scores at once, shape (N, C)
    yi_scores = scores[np.arange(num_train), y]   # correct-class score per example, shape (N,)
    margins = np.maximum(0, scores - yi_scores[:, np.newaxis] + 1)  # note delta = 1
    margins[np.arange(num_train), y] = 0          # the correct class contributes no loss
    loss = np.mean(np.sum(margins, axis=1))
    loss += 0.5 * reg * np.sum(W * W)
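The snippet above only computes the loss. A minimal sketch of the matching vectorized gradient, following the construction in the mlxai post linked below (it reuses `margins` and `num_train` from the loss computation):

    binary = (margins > 0).astype(float)          # 1 where a margin is violated, else 0
    row_sum = binary.sum(axis=1)                  # number of violated margins per example
    binary[np.arange(num_train), y] = -row_sum    # correct-class column gets minus that count
    dW = X.T.dot(binary) / num_train              # (D, N) @ (N, C) -> (D, C), averaged
    dW += reg * W                                 # gradient of 0.5 * reg * np.sum(W * W)

Column $j$ of dW thus accumulates $x_i$ once per violated margin, and column $y_i$ subtracts $x_i$ once per violation, exactly matching the naive double loop.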

#### References
- cs231n notes: linear classifier / SVM
- Vectorized implementation of SVM loss and gradient update: https://mlxai.github.io/2017/01/06/vectorized-implementation-of-svm-loss-and-gradient-update.html
