The math and computational implementation of softmax

Python, machine-learning

Author: Shixiang Wang

Published: October 21, 2024

In deep learning, the softmax activation function is defined as:

\[ y_k = \frac{\exp(a_k)}{\sum_{i=1}^{n}\exp(a_i)} \]

For the computational implementation, a mathematically equivalent but optimized form is:

\[ y_k = \frac{\exp(a_k + C)}{\sum_{i=1}^{n}\exp(a_i + C)} \]

Setting \(C = -\max_i a_i\), i.e. subtracting the largest input before exponentiating, keeps every exponent at or below zero and thus avoids floating-point overflow.
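The equivalence of the two forms follows from multiplying both the numerator and the denominator by the constant \(\exp(C)\), which leaves the ratio unchanged:

\[ y_k = \frac{\exp(a_k)}{\sum_{i=1}^{n}\exp(a_i)} = \frac{\exp(C)\,\exp(a_k)}{\exp(C)\sum_{i=1}^{n}\exp(a_i)} = \frac{\exp(a_k + C)}{\sum_{i=1}^{n}\exp(a_i + C)} \]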

import numpy as np

def softmax(a):
    # Naive implementation: np.exp() overflows to inf for large inputs
    exp_a = np.exp(a)
    sum_exp_a = np.sum(exp_a)
    y = exp_a / sum_exp_a

    return y

def softmax2(a):
    c = np.max(a)
    exp_a = np.exp(a - c)  # overflow countermeasure: shift inputs by their maximum
    sum_exp_a = np.sum(exp_a)
    y = exp_a / sum_exp_a

    return y

print(f"softmax(10) = {softmax(np.array([1, 10]))}")
print(f"softmax(1000) = {softmax(np.array([1, 1000]))}")

print(f"softmax2(10) = {softmax2(np.array([1, 10]))}")
print(f"softmax2(1000) = {softmax2(np.array([1, 1000]))}")

Output:

 python sigmoid_2ways.py
softmax(10) = [1.23394576e-04 9.99876605e-01]
/Users/family/Library/CloudStorage/OneDrive-shanghaitech.edu.cn/Learn/deeplearning/deep_learning_demo/ch03/sigmoid_2ways.py:4: RuntimeWarning: overflow encountered in exp
  exp_a = np.exp(a)
/Users/family/Library/CloudStorage/OneDrive-shanghaitech.edu.cn/Learn/deeplearning/deep_learning_demo/ch03/sigmoid_2ways.py:6: RuntimeWarning: invalid value encountered in divide
  y = exp_a / sum_exp_a
softmax(1000) = [ 0. nan]
softmax2(10) = [1.23394576e-04 9.99876605e-01]
softmax2(1000) = [0. 1.]
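The warnings explain the failure mode of the naive version: np.exp(1000) overflows to inf, and the subsequent division computes inf/inf, which is nan. In the shifted version the largest element becomes \(\exp(0) = 1\) and the smaller elements at worst underflow harmlessly to 0, so the result stays well defined.

For reference, here is a minimal cross-check against SciPy's scipy.special.softmax, which performs the same kind of max-shift internally; this sketch is not part of the original script and assumes SciPy is installed:

import numpy as np
from scipy.special import softmax

print(softmax(np.array([1, 10])))    # expected to match softmax2: [1.23394576e-04 9.99876605e-01]
print(softmax(np.array([1, 1000])))  # expected to match softmax2: [0. 1.], with no overflow warning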