SDSC6015 Lecture 5 - Mirror Descent and Stochastic Gradient Descent
#sdsc6015 Mirror Descent review. Motivation: consider the simplex-constrained optimization problem $\min_{x \in \triangle_d} f(x)$, where the simplex is $\triangle_d := \{x \in \mathbb{R}^d : \sum_{i=1}^d x_i = 1,\ x_i \geq 0\ \forall i\}$. Assume the gradient is bounded in the infinity norm: $\|\nabla f(x)\|_\infty = \max_{i=1,\ldots,d} |[\nabla f(x)]_i| \leq 1$. Notation: $x$ is the optimization variable, $d$ is the dimension, and $\triangle_d$ is the probability simplex. Geometric meaning: the simplex is the set of probability ...
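For the simplex setting above, the usual instantiation of mirror descent is the exponentiated-gradient update (entropy mirror map). The sketch below is illustrative, not the note's own code; the function name, the linear test objective, and the step size are all assumptions made for the example.

```python
import numpy as np

def mirror_descent_simplex(grad, x0, eta, T):
    """Mirror descent with the entropy mirror map (exponentiated gradient).

    Each update multiplies the coordinates by exp(-eta * gradient) and
    renormalizes, so every iterate stays on the probability simplex
    without an explicit projection step.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(T):
        x = x * np.exp(-eta * grad(x))
        x /= x.sum()  # renormalize back onto the simplex
    return x

# Toy objective (an assumption for illustration): f(x) = <c, x>, whose
# minimizer over the simplex puts all mass on the smallest entry of c.
c = np.array([0.3, 1.0, 0.7])
x = mirror_descent_simplex(lambda x: c, np.ones(3) / 3, eta=0.5, T=200)
```

With a constant gradient the iterates reduce to multiplicative weights, so the mass concentrates on coordinate 0, the argmin of `c`.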
SDSC6015 Lecture 4 - Projected Gradient Descent, Proximal Gradient Descent, and an Introduction to Mirror Descent
#sdsc6015 Projected Gradient Descent. Projected gradient descent is an algorithm for constrained optimization: after each gradient step it projects back onto the feasible set, which keeps the constraint satisfied. The constrained optimization problem is $\min f(x)$ subject to $x \in X$, where $f: \mathbb{R}^d \rightarrow \mathbb{R}$ is the objective function, $X \subseteq \mathbb{R}^d$ is a closed convex set, and $x \in \mathbb{R}^d$ is the optimization variable. Geometric meaning: find the point minimizing $f(x)$ among those satisfying $x \in X$. Algorithm: the iteration is, for $t = 0, 1, 2, \ldots$ ...
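The gradient-step-then-project iteration described above can be sketched as follows. The feasible set (a Euclidean ball, where projection has a closed form), the quadratic objective, and all function names are assumptions chosen for the example, not taken from the note.

```python
import numpy as np

def project_ball(y, radius=1.0):
    """Euclidean projection onto the closed convex set {x : ||x|| <= radius}."""
    norm = np.linalg.norm(y)
    return y if norm <= radius else y * (radius / norm)

def projected_gradient_descent(grad, project, x0, eta, T):
    """Iterate x_{t+1} = Pi_X(x_t - eta * grad(x_t))."""
    x = np.asarray(x0, dtype=float)
    for _ in range(T):
        x = project(x - eta * grad(x))  # gradient step, then project
    return x

# Toy problem: minimize ||x - b||^2 over the unit ball with b outside
# the ball; the constrained minimizer is b / ||b||.
b = np.array([3.0, 4.0])
x = projected_gradient_descent(lambda x: 2 * (x - b), project_ball,
                               np.zeros(2), eta=0.1, T=100)
```

Here the unconstrained minimizer `b` is infeasible, so the iterates settle on the boundary point `b / ||b|| = [0.6, 0.8]`.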
SDSC6012 - Assignment 1
#assignment #sdsc6012
SDSC6012 - Question of Assignment 2
#assignment #sdsc6012
SDSC6012 - Assignment 1
#assignment #sdsc6012 Question 1: Trend Component Extraction (Moving Average Method). The trend component is extracted using the centered moving average method: $\text{Trend}_t = \frac{1}{k} \sum_{i=t-m}^{t+m} x_i$, where $k$ is the window size (here set to 12, corresponding to the annual cycle) and $m = \lfloor k/2 \rfloor$ is the half-window width for the centered moving average. Boundary handling: when $t < m$ ...
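The centered moving average above can be sketched directly from the stated formula. This is a minimal illustration, not the assignment's solution; the function name and the choice to leave boundary points as NaN (the preview truncates before describing boundary handling) are assumptions.

```python
import numpy as np

def centered_trend(x, k=12):
    """Trend_t = (1/k) * sum_{i=t-m}^{t+m} x_i with m = k // 2, as stated
    in the assignment preview. Boundary points with t < m or
    t >= len(x) - m are left as NaN (an assumption for this sketch)."""
    x = np.asarray(x, dtype=float)
    m = k // 2
    trend = np.full(len(x), np.nan)
    for t in range(m, len(x) - m):
        trend[t] = x[t - m : t + m + 1].sum() / k
    return trend

# On a pure linear series, the window sum over 2m+1 = 13 points centered
# at t equals 13 * t, so the stated formula gives 13 * t / 12.
x = np.arange(36, dtype=float)
trend = centered_trend(x, k=12)
```

Note that with `k = 12` the window holds `2m + 1 = 13` points but divides by `k = 12`, exactly as the formula in the note is written.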
SDSC6007 - Assignment 2
#assignment #sdsc6007 Problem link: SDSC6007 - Question of Assignment 2
SDSC6007 - Question of Assignment 2
#assignment #sdsc6007
SDSC5001 Course 4-Linear Regression
#sdsc5001 Simple Linear Regression. Basic Setup: given data $(x_1, y_1), \ldots, (x_n, y_n)$, where $x_i \in \mathbb{R}$ is the predictor variable (independent variable, input, feature) and $y_i \in \mathbb{R}$ is the response variable (dependent variable, output, outcome). The regression function is expressed as $y = f(x) + \varepsilon$. The linear regression model assumes $f(x) = \beta_0 + \beta_1 x$ ...
SDSC5001 Lecture 4 - Linear Regression
#sdsc5001 Simple Linear Regression. Basic setup: given data $(x_1, y_1), \ldots, (x_n, y_n)$, where $x_i \in \mathbb{R}$ is the predictor variable (independent variable, input, feature) and $y_i \in \mathbb{R}$ is the response variable (dependent variable, output, outcome). The regression function is $y = f(x) + \varepsilon$, and the linear regression model assumes $f(x) = \beta_0 + \beta_1 x$; this is usually viewed as an approximation to the true relationship. Example (attachment page 2): a simple toy example showing data points and a linear fit. Least Squares Fitting: the parameters are estimated by minimizing the residual sum of squares: $\min_{\beta_0, \beta_1} \sum_{i=1}^n (y_i - (\beta_0 + \beta_1 x_i))^2$ ...
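The least-squares minimization above has the standard closed-form solution, which can be sketched in a few lines. The function name and the toy data are assumptions for illustration; the formulas for $\hat\beta_1$ and $\hat\beta_0$ are the usual simple-linear-regression estimators.

```python
import numpy as np

def least_squares_fit(x, y):
    """Closed-form minimizer of sum_i (y_i - (b0 + b1 * x_i))^2:
    b1 = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2),
    b0 = ybar - b1 * xbar."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xbar, ybar = x.mean(), y.mean()
    b1 = ((x - xbar) * (y - ybar)).sum() / ((x - xbar) ** 2).sum()
    b0 = ybar - b1 * xbar
    return b0, b1

# Sanity check on noiseless data: points on the line y = 2 + 3x are
# recovered exactly, since the residual sum of squares can reach zero.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 + 3.0 * x
b0, b1 = least_squares_fit(x, y)
```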
SDSC6007 Lecture 3 - Tutorial and Hidden Markov Models
#sdsc6007 Tutorial 1. Problem setup: dynamic system $x_{k+1} = x_k + u_k + w_k$, $k = 0, 1, 2, 3$; initial state $x_0 = 5$; cost function $\sum_{k=0}^{3} (x_k^2 + u_k^2)$; state space $S_k = \{0, 1, 2, 3, 4, 5\}$; control constraint $U_k(x_k) = \{u \mid 0 \leq x_k + u \leq 5,\ u \in \mathbb{Z}\}$; random disturbance: if $0 < x_k + u_k < 5$, then $w_k = -1$ with probability $\frac{1}{2}$ and $w_k = 1$ with probability $\frac{1}{2}$ ...
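A problem of this shape is solved by backward dynamic programming over the finite state space. The sketch below is a hedged illustration of that recursion, not the tutorial's worked solution: the preview truncates before stating what happens at the boundary states, so the assumption $w_k = 0$ when $x_k + u_k \in \{0, 5\}$, the zero terminal cost, and all names are mine.

```python
def backward_dp(N=4, states=range(6)):
    """Backward DP for x_{k+1} = x_k + u_k + w_k with stage cost
    x_k^2 + u_k^2, k = 0..N-1, and terminal cost J_N = 0 (assumed)."""
    J = {x: 0.0 for x in states}  # terminal cost-to-go
    policy = []
    for k in reversed(range(N)):
        Jk, muk = {}, {}
        for x in states:
            best_u, best_cost = None, float("inf")
            for u in range(-x, 5 - x + 1):  # enforce 0 <= x + u <= 5
                y = x + u
                stage = x * x + u * u
                if 0 < y < 5:               # w_k = -1 or +1, each w.p. 1/2
                    exp_next = 0.5 * J[y - 1] + 0.5 * J[y + 1]
                else:                       # assumed: w_k = 0 at boundary
                    exp_next = J[y]
                cost = stage + exp_next
                if cost < best_cost:
                    best_cost, best_u = cost, u
            Jk[x], muk[x] = best_cost, best_u
        J, policy = Jk, [muk] + policy
    return J, policy  # J = J_0, policy[k] maps state -> optimal u_k

J, policy = backward_dp()
```

Under these assumptions, state 0 is absorbing and free (`u = 0` incurs no cost and the boundary disturbance is 0), so `J[0] = 0`, while any start at $x_0 = 5$ pays at least the initial $x_0^2$.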
