python/ML(12)
[ML] MNIST_Hand written code
This code is based on Michael Nielsen’s implementation and runs on the MNIST handwritten digit recognition dataset. The key components of this code are the class structure, the feedforward pass, stochastic gradient descent (SGD), and backpropagation. The code is as follows:

import numpy as np
import random

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    """Derivative of..
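A fuller sketch of these pieces, in the spirit of Nielsen's network.py (the layer sizes below are illustrative, and the Network class is abbreviated to the activation helpers and the feedforward step):

import numpy as np

def sigmoid(z):
    # logistic activation, applied elementwise
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # derivative of the sigmoid, used during backpropagation
    return sigmoid(z) * (1.0 - sigmoid(z))

class Network:
    def __init__(self, sizes):
        # sizes like [784, 30, 10] for MNIST: one bias vector and one
        # weight matrix for every layer after the input layer
        self.sizes = sizes
        self.biases = [np.random.randn(y, 1) for y in sizes[1:]]
        self.weights = [np.random.randn(y, x)
                        for x, y in zip(sizes[:-1], sizes[1:])]

    def feedforward(self, a):
        # push activation a through each layer in turn
        for b, w in zip(self.biases, self.weights):
            a = sigmoid(np.dot(w, a) + b)
        return a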
2025.08.09
[ML] Bayesian Concept learning
Before this chapter, we discussed Bayesian learning. The Bayes formula is: P(y = c | x) = P(x | y = c) · P(y = c) / P(x). In this formula, P(x | y = c) is called the likelihood and P(y = c) is called the prior. Let’s dive into an example to better understand this concept. Suppose we want to predict the class label for a given input feature vector. For example, imagine the input features are x = [0.5, 0.3, 0.7, 0.8]. When this..
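A minimal numeric sketch of this posterior computation (the class priors, per-class means, and the Gaussian likelihood below are made-up assumptions for illustration, not values from the post):

import numpy as np
from scipy.stats import norm

x = np.array([0.5, 0.3, 0.7, 0.8])         # the input feature vector from the example
priors = {0: 0.6, 1: 0.4}                  # hypothetical class priors P(y = c)
means = {0: 0.4, 1: 0.7}                   # hypothetical per-class feature means

# naive-Bayes-style likelihood: product of per-feature Gaussian densities
likelihood = {c: norm.pdf(x, loc=means[c], scale=0.2).prod() for c in priors}
unnormalized = {c: likelihood[c] * priors[c] for c in priors}
evidence = sum(unnormalized.values())      # P(x), the normalizer
posterior = {c: unnormalized[c] / evidence for c in priors}
print(posterior)                           # P(y = c | x) for each class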
2025.08.04
[Probability] Bayes Rule
According to the definition of conditional probability, P(X = x | Y = y) = p(X = x, Y = y) / p(Y = y). We can rewrite the numerator using the product rule: p(X = x, Y = y) = p(X = x) × p(Y = y | X = x). We can also rewrite the denominator using the law of total probability: p(Y = y) = Σ over x' [ p(X = x') × p(Y = y | X = x') ]. For example, if the event Y = y corresponds to "ate melon", then the total..
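A small numeric check of this derivation (the two states of X and every probability are hypothetical numbers chosen only to illustrate the "ate melon" example):

# X is an unobserved cause, Y = "ate melon" is the observed event
p_x = {"hungry": 0.3, "not_hungry": 0.7}           # p(X = x'), hypothetical prior
p_y_given_x = {"hungry": 0.8, "not_hungry": 0.1}   # p(Y = ate melon | X = x')

# law of total probability: p(Y = y) = sum over x' of p(X = x') * p(Y = y | X = x')
p_y = sum(p_x[x] * p_y_given_x[x] for x in p_x)

# Bayes rule: p(X = x | Y = y) = p(X = x) * p(Y = y | X = x) / p(Y = y)
posterior = {x: p_x[x] * p_y_given_x[x] / p_y for x in p_x}
print(p_y, posterior)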
2025.08.03
[Linear_algebra] Null space
Let A be a linear transformation, where A ∈ R^(m×n). The set of all vectors x such that A x = 0 is called the null space of A. In geometric intuition, the null space represents the set of directions that get "squished" to the zero vector by A. For example, let A = [[1, 2], [3, 6]]. In this case, the null space is { [x, y]ᵗ ∈ R² | x = -2y }, or equivalently, { [-2y, y]ᵗ | y ∈ R }. One example from this sp..
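A quick numerical check of this example (a sketch using SciPy's null_space helper, which returns an orthonormal basis for the null space):

import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0],
              [3.0, 6.0]])      # the example matrix from the post

ns = null_space(A)              # orthonormal basis for {x : A x = 0}
print(ns)                       # a single column proportional to [-2, 1]^T
print(np.allclose(A @ ns, 0))   # every basis vector maps to (numerically) zero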
2025.08.01
[ML_7] Classification by using DecisionTreeClassifier(+)
#%%
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, roc_curve, roc_auc_score, f1_score, precision_recall_curve
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.preprocessing import LabelEncoder
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

feature_name_df = pd.read_csv("data/har_dataset/features.txt..
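The post's HAR-dataset loading is cut off above, so here is a minimal sketch of the same DecisionTreeClassifier + GridSearchCV workflow on a built-in dataset (load_iris and the parameter grid are stand-in assumptions, not the post's data or settings):

from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# search over tree depth and minimum samples per split with 5-fold CV
params = {"max_depth": [2, 4, 6, 8], "min_samples_split": [2, 8, 16]}
grid = GridSearchCV(DecisionTreeClassifier(random_state=0), param_grid=params, cv=5)
grid.fit(X_train, y_train)

best_tree = grid.best_estimator_
print(grid.best_params_)
print(accuracy_score(y_test, best_tree.predict(X_test)))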
2025.07.18
[ML_code] visualize_boundary(model, X, y)
import numpy as np
import matplotlib.pyplot as plt

# Function that visualizes a classifier's decision boundary
def visualize_boundary(model, X, y):
    fig, ax = plt.subplots()

    # Show the training data as a scatter plot
    ax.scatter(X[:, 0], X[:, 1], c=y, s=25, cmap='rainbow',
               edgecolor='k', clim=(y.min(), y.max()), zorder=3)
    ax.axis('tight')
    ax.axis('off')

    xlim_start, xlim_end = ax.get_xlim()
    ylim_start, ylim_end = ax.get_yl..
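The excerpt stops at the axis limits; a common way to finish this kind of helper (a sketch under that assumption, not necessarily the post's exact code) is to evaluate the fitted model on a dense grid and draw the predicted regions as filled contours:

import numpy as np
import matplotlib.pyplot as plt

def visualize_boundary_sketch(model, X, y):
    # scatter the training points, colored by class label
    fig, ax = plt.subplots()
    ax.scatter(X[:, 0], X[:, 1], c=y, s=25, cmap='rainbow', edgecolor='k', zorder=3)
    ax.axis('tight')
    ax.axis('off')
    xlim = ax.get_xlim()
    ylim = ax.get_ylim()

    # evaluate the model on a grid covering the plot area
    xx, yy = np.meshgrid(np.linspace(*xlim, 200), np.linspace(*ylim, 200))
    Z = model.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

    # filled contours of the predicted class behind the scatter points
    n_classes = len(np.unique(y))
    ax.contourf(xx, yy, Z, alpha=0.3, cmap='rainbow', zorder=1,
                levels=np.arange(n_classes + 1) - 0.5)
    ax.set(xlim=xlim, ylim=ylim)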
2025.07.17