All posts (334)
-
[Linear_Algebra] Gauss Elimination & Gauss-Jordan Elimination
Today, we are going to talk about Gauss Elimination and Gauss–Jordan Elimination, which are crucial for solving linear equations. We discussed Gauss Elimination in the previous post. First of all, when GE is performed, the matrix is converted into Row Echelon Form. The next step is back substitution, which means working upward through the rows to solve for the variables one by one. On the other hand, when GJ is..
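As a rough sketch of the two phases described above (forward elimination to row echelon form, then back substitution working upward), assuming nonzero pivots and skipping partial pivoting; the function name and the 2×2 example below are illustrative, not taken from the post.

import numpy as np

def solve_by_elimination(A, b):
    # Forward elimination: zero out the entries below each pivot.
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n - 1):
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]          # assumes A[k, k] != 0 (no pivoting)
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # Back substitution: solve from the last row upward.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

print(solve_by_elimination(np.array([[2.0, 1.0], [1.0, 3.0]]),
                           np.array([3.0, 5.0])))   # -> [0.8 1.4]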
2025.08.21 -
[Linear_Algebra] Gaussian elimination and LU factorization
We can solve a system of linear equations with Gaussian elimination (GE). However, to carry out this process in a computer program, it has to be expressed with matrices. Let's look at how this is done. 1. Elementary matrix. Let A be the original matrix. In this setting, we multiply the first row by a constant and add it to (or subtract it from) the second row. The matrix that represents this operation is called E21; it is obtained from the identity matrix by placing the constant at position (2,1). Another example is E32, which places the coefficient at position (3,2) of the identity matrix. Elementary matri..
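To make the excerpt's idea concrete, here is a small NumPy sketch with illustrative numbers of my own: E21 is the identity matrix with the row-operation multiplier written into position (2,1), and left-multiplying A by E21 performs that row operation.

import numpy as np

A = np.array([[ 2.0,  1.0, 1.0],
              [ 4.0, -6.0, 0.0],
              [-2.0,  7.0, 2.0]])

# E21: identity with a multiplier at position (2,1) (1-based indexing),
# here "subtract 2 * (row 1) from (row 2)".
E21 = np.eye(3)
E21[1, 0] = -2.0

print(E21 @ A)   # row 2 of the product becomes [ 0. -8. -2.]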
2025.08.20 -
[Linear_algebra] Linearity
What does it mean for a function to be linear? 1. f(x1 + x2) = f(x1) + f(x2) 2. f(cx1) = cf(x1) Combining the two, an equation is linear if it satisfies f(ax1 + bx2) = af(x1) + bf(x2).
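As a quick numerical check of the combined condition (the functions and constants below are my own illustration): f(x) = 3x satisfies it, while g(x) = x + 1 does not because of the constant offset.

f = lambda x: 3 * x      # linear
g = lambda x: x + 1      # affine, not linear

a, b, x1, x2 = 2, 3, 4, 5
print(f(a * x1 + b * x2) == a * f(x1) + b * f(x2))   # True
print(g(a * x1 + b * x2) == a * g(x1) + b * g(x2))   # False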
2025.08.20 -
[python] class ; single underscore, double underscore
When we write a class, we often see _ or __ in attribute names. But what do they mean? A single underscore (_) means the attribute is intended to be used inside the class, but this is not enforced. A double underscore (__) also marks it as internal, but with stronger protection: a name like __hidden triggers Name Mangling, so __hidden is rewritten as _ClassName__hidden. class nn: def __init__(sel..
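Since the class nn example in the excerpt is cut off, here is a separate minimal sketch of the same idea (the attribute names _lr and __hidden are placeholders of my own): the double-underscore name is mangled to _ClassName__hidden, so from outside the class it is only reachable under that mangled name.

class nn:
    def __init__(self):
        self._lr = 0.1        # single underscore: internal by convention only
        self.__hidden = 64    # double underscore: name-mangled

net = nn()
print(net._lr)                # works, just discouraged outside the class
print(net._nn__hidden)        # __hidden is actually stored as _nn__hidden
# print(net.__hidden)         # would raise AttributeError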
2025.08.17 -
[ML] MNIST_Hand Digit with CrossEntropy and Matrix for
In the previous post, we trained an MNIST network with the quadratic (MSE) cost and a per-sample backprop loop. In this post, we switch to the Cross-Entropy (CE) cost and a mini-batch (vectorized) update. With sigmoid + CE (binary) or softmax + CE (multiclass), the output error is δ^L = a^L − y. There is no extra σ′(z) factor multiplying the error at the output layer. This avoids the severe gradient shrinkage you..
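A hedged sketch of the point above, with shapes and names of my own choosing rather than the post's actual code: with sigmoid (or softmax) plus cross-entropy, the error at the output layer is simply activations minus targets, and the mini-batch weight gradient is a single matrix product.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W = rng.normal(size=(10, 784)) * 0.01       # output weights (10 classes, 784 pixels)
b = np.zeros((10, 1))
X = rng.random((784, 32))                   # a mini-batch of 32 inputs as columns
Y = np.eye(10)[:, rng.integers(0, 10, 32)]  # one-hot targets

A = sigmoid(W @ X + b)
delta = A - Y                               # output error: no sigma'(z) factor with CE
grad_W = (delta @ X.T) / X.shape[1]         # gradient averaged over the mini-batch
grad_b = delta.mean(axis=1, keepdims=True)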
2025.08.12 -
[ML] MNIST_Handwritten code
This code is based on Michael Nielsen’s implementation and runs on the MNIST handwritten digit recognition dataset. The key components of this code are the class structure, the feedforward process, stochastic gradient descent (SGD), and backpropagation. The code is as follows:

import numpy as np
import random

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    """Derivative of..
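For context, a hedged sketch of what the truncated sigmoid_prime and the feedforward step typically look like in Nielsen-style code (written here as free functions with illustrative weights/biases lists, not the post's exact class attributes):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # derivative of the sigmoid: sigma(z) * (1 - sigma(z))
    return sigmoid(z) * (1.0 - sigmoid(z))

def feedforward(a, weights, biases):
    # apply each layer in turn: a <- sigmoid(w a + b)
    for w, b in zip(weights, biases):
        a = sigmoid(w @ a + b)
    return a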
2025.08.09