Parameter Estimation for the Logistic Regression Model
Sigmoid function (logistic function)

$$\sigma(x) = \frac{1}{1+e^{-x}}$$
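As a quick sketch, the sigmoid can be implemented in NumPy. The helper name `sigmoid` and the split by sign are my own choices here; the split avoids overflow in `exp(-x)` for large negative inputs.

```python
import numpy as np

def sigmoid(x):
    # 1 / (1 + e^{-x}), computed in a numerically stable way.
    # For x >= 0 use the standard form; for x < 0 rewrite as
    # e^x / (1 + e^x) so exp() never receives a large positive argument.
    x = np.asarray(x, dtype=float)
    out = np.empty_like(x)
    pos = x >= 0
    out[pos] = 1.0 / (1.0 + np.exp(-x[pos]))
    expx = np.exp(x[~pos])
    out[~pos] = expx / (1.0 + expx)
    return out

# Expects an ndarray; sigmoid(0) = 0.5, and the tails saturate at 0 and 1.
vals = sigmoid(np.array([-1000.0, 0.0, 1000.0]))  # no overflow
```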
Derivative of the sigmoid function

$$
\begin{aligned}
\frac{d}{dx}\sigma(x) &= \frac{d}{dx}(1+e^{-x})^{-1} \\
&= (-1)\frac{1}{(1+e^{-x})^2}\,\frac{d}{dx}(1+e^{-x}) \\
&= (-1)\frac{1}{(1+e^{-x})^2}\,(0+e^{-x})\,\frac{d}{dx}(-x) \\
&= (-1)\frac{1}{(1+e^{-x})^2}\,e^{-x}\,(-1) \\
&= \frac{e^{-x}}{(1+e^{-x})^2} \\
&= \frac{(1+e^{-x})-1}{(1+e^{-x})^2} \\
&= \frac{1}{1+e^{-x}} - \frac{1}{(1+e^{-x})^2} \\
&= \frac{1}{1+e^{-x}}\left(1 - \frac{1}{1+e^{-x}}\right) \\
&= \sigma(x)\bigl(1-\sigma(x)\bigr)
\end{aligned}
$$

$$\sigma'(x) = \sigma(x)\bigl(1-\sigma(x)\bigr)$$
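The identity $\sigma'(x)=\sigma(x)(1-\sigma(x))$ can be sanity-checked numerically with a central finite difference (a sketch; the grid and step size are arbitrary choices):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Central finite difference approximates the derivative; it should
# agree with the analytic form sigmoid(x) * (1 - sigmoid(x)).
x = np.linspace(-5, 5, 11)
h = 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
analytic = sigmoid(x) * (1 - sigmoid(x))
max_err = np.max(np.abs(numeric - analytic))
```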
Cost function

$$\mathrm{Cost}\bigl(h_\theta(x),\,y\bigr) = -\,y\log\bigl(h_\theta(x)\bigr) - (1-y)\log\bigl(1-h_\theta(x)\bigr)$$
Overall cost function

$$J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log\bigl(h_\theta(x^{(i)})\bigr) + \bigl(1-y^{(i)}\bigr)\log\bigl(1-h_\theta(x^{(i)})\bigr)\right]$$
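A minimal vectorized sketch of $J(\theta)$, assuming `X` has one row per example (with a bias column if desired) and `h_theta(x) = sigmoid(X @ theta)`; the `eps` clipping is my addition to keep the logs finite:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y, eps=1e-12):
    # J(theta) = -(1/m) * sum[ y*log(h) + (1-y)*log(1-h) ]
    # eps clips h away from 0 and 1 so log() never sees exactly 0.
    m = len(y)
    h = sigmoid(X @ theta)
    h = np.clip(h, eps, 1 - eps)
    return -(1.0 / m) * np.sum(y * np.log(h) + (1 - y) * np.log(1 - h))

# With theta = 0, h = 0.5 for every example, so J = log(2) regardless of labels.
X = np.array([[1.0, 2.0], [1.0, -1.0]])
y = np.array([1.0, 0.0])
J0 = cost(np.zeros(2), X, y)  # ~0.6931 (= log 2)
```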
Derivative of the cost function

Using $h_\theta(x) = \sigma(\theta^T x)$, $\sigma'(z)=\sigma(z)(1-\sigma(z))$, and $\frac{\partial}{\partial\theta_j}\theta^T x^{(i)} = x^{(i)}_j$:

$$
\begin{aligned}
\frac{\partial}{\partial\theta_j}J(\theta)
&= \frac{\partial}{\partial\theta_j}\left(-\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log\bigl(h_\theta(x^{(i)})\bigr) + \bigl(1-y^{(i)}\bigr)\log\bigl(1-h_\theta(x^{(i)})\bigr)\right]\right) \\
&= -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\frac{\partial}{\partial\theta_j}\log\bigl(h_\theta(x^{(i)})\bigr) + \bigl(1-y^{(i)}\bigr)\frac{\partial}{\partial\theta_j}\log\bigl(1-h_\theta(x^{(i)})\bigr)\right] \\
&= -\frac{1}{m}\sum_{i=1}^{m}\left[\frac{y^{(i)}\,\frac{\partial}{\partial\theta_j}h_\theta(x^{(i)})}{h_\theta(x^{(i)})} + \frac{\bigl(1-y^{(i)}\bigr)\frac{\partial}{\partial\theta_j}\bigl(1-h_\theta(x^{(i)})\bigr)}{1-h_\theta(x^{(i)})}\right] \\
&= -\frac{1}{m}\sum_{i=1}^{m}\left[\frac{y^{(i)}\,\frac{\partial}{\partial\theta_j}\sigma(\theta^T x^{(i)})}{h_\theta(x^{(i)})} + \frac{\bigl(1-y^{(i)}\bigr)\frac{\partial}{\partial\theta_j}\bigl(1-\sigma(\theta^T x^{(i)})\bigr)}{1-h_\theta(x^{(i)})}\right] \\
&= -\frac{1}{m}\sum_{i=1}^{m}\left[\frac{y^{(i)}\,\sigma(\theta^T x^{(i)})\bigl(1-\sigma(\theta^T x^{(i)})\bigr)\frac{\partial}{\partial\theta_j}\theta^T x^{(i)}}{h_\theta(x^{(i)})} - \frac{\bigl(1-y^{(i)}\bigr)\,\sigma(\theta^T x^{(i)})\bigl(1-\sigma(\theta^T x^{(i)})\bigr)\frac{\partial}{\partial\theta_j}\theta^T x^{(i)}}{1-h_\theta(x^{(i)})}\right] \\
&= -\frac{1}{m}\sum_{i=1}^{m}\left[\frac{y^{(i)}\,h_\theta(x^{(i)})\bigl(1-h_\theta(x^{(i)})\bigr)\frac{\partial}{\partial\theta_j}\theta^T x^{(i)}}{h_\theta(x^{(i)})} - \frac{\bigl(1-y^{(i)}\bigr)\,h_\theta(x^{(i)})\bigl(1-h_\theta(x^{(i)})\bigr)\frac{\partial}{\partial\theta_j}\theta^T x^{(i)}}{1-h_\theta(x^{(i)})}\right] \\
&= -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\bigl(1-h_\theta(x^{(i)})\bigr)x^{(i)}_j - \bigl(1-y^{(i)}\bigr)h_\theta(x^{(i)})\,x^{(i)}_j\right] \\
&= -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\bigl(1-h_\theta(x^{(i)})\bigr) - \bigl(1-y^{(i)}\bigr)h_\theta(x^{(i)})\right]x^{(i)}_j \\
&= -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)} - y^{(i)}h_\theta(x^{(i)}) - h_\theta(x^{(i)}) + y^{(i)}h_\theta(x^{(i)})\right]x^{(i)}_j \\
&= -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)} - h_\theta(x^{(i)})\right]x^{(i)}_j \\
&= \frac{1}{m}\sum_{i=1}^{m}\left[h_\theta(x^{(i)}) - y^{(i)}\right]x^{(i)}_j
\end{aligned}
$$
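The final gradient $\frac{1}{m}\sum_i\bigl(h_\theta(x^{(i)})-y^{(i)}\bigr)x^{(i)}_j$ vectorizes as `X.T @ (h - y) / m`. A sketch that also verifies it against a numerical gradient of the cost (data and helper names are illustrative only):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    h = sigmoid(X @ theta)
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

def gradient(theta, X, y):
    # (1/m) * sum_i (h(x_i) - y_i) * x_i  ==  X.T @ (h - y) / m
    m = len(y)
    return X.T @ (sigmoid(X @ theta) - y) / m

# Sanity check: each component should match a central finite difference of J.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = (rng.random(20) < 0.5).astype(float)
theta = rng.normal(size=3)
g = gradient(theta, X, y)
h = 1e-6
for j in range(3):
    e = np.zeros(3)
    e[j] = h
    num = (cost(theta + e, X, y) - cost(theta - e, X, y)) / (2 * h)
    assert abs(num - g[j]) < 1e-6
```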
Gradient Descent

$$\text{Repeat}\ \bigl\{\ \theta_j := \theta_j - \alpha\,\frac{\partial}{\partial\theta_j}J(\theta)\ \bigr\}$$

(updating every $\theta_j$ simultaneously, with learning rate $\alpha$)
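Putting the pieces together, the update rule can be sketched as a simple batch gradient-descent loop. The function name `fit_logistic`, the fixed iteration count, and the toy data are all assumptions for illustration, not part of the original post:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, alpha=0.1, n_iters=5000):
    # Repeat { theta_j := theta_j - alpha * dJ/dtheta_j },
    # applied to all components at once via the vectorized gradient.
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(n_iters):
        grad = X.T @ (sigmoid(X @ theta) - y) / m
        theta -= alpha * grad
    return theta

# Toy separable data: label is 1 when x > 0; first column is the bias term.
x = np.linspace(-3, 3, 50)
X = np.column_stack([np.ones_like(x), x])
y = (x > 0).astype(float)
theta = fit_logistic(X, y)
preds = (sigmoid(X @ theta) > 0.5).astype(float)
acc = (preds == y).mean()  # this toy set is linearly separable
```

In practice one would add a convergence check (e.g. stop when the gradient norm is small) instead of a fixed iteration count.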
Source: https://unabated.tistory.com/entry/로지스틱-회귀모델의-모수-추정?category=735138 [랄라라]