Low-rank approximation

From Wikipedia, the free encyclopedia

In mathematics, low-rank approximation refers to the process of approximating a given matrix by a matrix of lower rank. More precisely, it is a minimization problem, in which the cost function measures the fit between a given matrix (the data) and an approximating matrix (the optimization variable), subject to a constraint that the approximating matrix has reduced rank. The problem is used for mathematical modeling and data compression. The rank constraint is related to a constraint on the complexity of a model that fits the data. In applications, often there are other constraints on the approximating matrix apart from the rank constraint, e.g., non-negativity and Hankel structure.

Low-rank approximation is closely related to numerous other techniques, including principal component analysis, factor analysis, total least squares, latent semantic analysis, orthogonal regression, and dynamic mode decomposition.

Definition

[edit]

Given

  • structure specification $\mathcal{S} \colon \mathbb{R}^{n_p} \to \mathbb{R}^{m \times n}$,
  • vector of structure parameters $p \in \mathbb{R}^{n_p}$,
  • norm $\|\cdot\|$, and
  • desired rank $r$,

the structured low-rank approximation problem is

$$\text{minimize over } \widehat{p} \quad \|p - \widehat{p}\| \quad \text{subject to} \quad \operatorname{rank}\big(\mathcal{S}(\widehat{p})\big) \le r.$$
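
For example, with Hankel structure the parameter vector fills the anti-diagonals of $\mathcal{S}(p)$. A minimal Octave/MATLAB sketch of the ingredients above (the variable names are illustrative, not from a reference implementation):

S = @(p) hankel(p(1:3), p(3:5));  % structure specification S: R^5 -> R^{3x3}, Hankel
p = [1; 2; 3; 4; 5];              % vector of structure parameters
r = 1;                            % desired rank
D = S(p);                         % the structured data matrix
% The problem asks for a p_hat close to p in the chosen norm
% such that rank(S(p_hat)) <= r.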

Applications

[edit]

Basic low-rank approximation problem

[edit]

The unstructured problem with fit measured by the Frobenius norm, i.e.,

$$\text{minimize over } \widehat{D} \quad \|D - \widehat{D}\|_F \quad \text{subject to} \quad \operatorname{rank}(\widehat{D}) \le r,$$

has an analytic solution in terms of the singular value decomposition of the data matrix. The result is referred to as the matrix approximation lemma or Eckart–Young–Mirsky theorem. This problem was originally solved by Erhard Schmidt[1] in the infinite dimensional context of integral operators (although his methods easily generalize to arbitrary compact operators on Hilbert spaces) and later rediscovered by C. Eckart and G. Young.[2] L. Mirsky generalized the result to arbitrary unitarily invariant norms.[3] Let

$$D = U \Sigma V^\top \in \mathbb{R}^{m \times n}, \quad m \ge n,$$

be the singular value decomposition of $D$, where $\Sigma = \operatorname{diag}(\sigma_1, \ldots, \sigma_n)$ is the $m \times n$ rectangular diagonal matrix with the singular values $\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_n \ge 0$. For a given $r$, partition $U$, $\Sigma$, and $V$ as follows:

$$U = \begin{bmatrix} U_1 & U_2 \end{bmatrix}, \quad \Sigma = \begin{bmatrix} \Sigma_1 & 0 \\ 0 & \Sigma_2 \end{bmatrix}, \quad V = \begin{bmatrix} V_1 & V_2 \end{bmatrix},$$

where $U_1$ is $m \times r$, $\Sigma_1$ is $r \times r$, and $V_1$ is $n \times r$. Then the rank-$r$ matrix, obtained from the truncated singular value decomposition

$$\widehat{D}^* = U_1 \Sigma_1 V_1^\top,$$

is such that

$$\|D - \widehat{D}^*\|_F = \min_{\operatorname{rank}(\widehat{D}) \le r} \|D - \widehat{D}\|_F = \sqrt{\sigma_{r+1}^2 + \cdots + \sigma_n^2}.$$

The minimizer $\widehat{D}^*$ is unique if and only if $\sigma_{r+1} \ne \sigma_r$.
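
The theorem can be checked numerically; the following Octave/MATLAB sketch (illustrative, with random data) verifies that the truncated SVD attains the stated Frobenius- and spectral-norm errors:

% Verify the Eckart-Young-Mirsky error formulas on random data
m = 8; n = 5; r = 2;
D = randn(m, n);                             % data matrix
[U, S, V] = svd(D);                          % D = U*S*V'
Dh = U(:, 1:r) * S(1:r, 1:r) * V(:, 1:r)';   % truncated SVD, rank r
s = diag(S);
disp([norm(D - Dh, 'fro'), sqrt(sum(s(r+1:end).^2))])  % the two agree
disp([norm(D - Dh), s(r+1)])                 % spectral-norm error = sigma_{r+1}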

Proof of Eckart–Young–Mirsky theorem (for spectral norm)

[edit]

Let $A \in \mathbb{R}^{m \times n}$ be a real (possibly rectangular) matrix with $m \ge n$. Suppose that

$$A = U \Sigma V^\top$$

is the singular value decomposition of $A$. Recall that $U$ and $V$ are orthogonal matrices, and $\Sigma$ is an $m \times n$ rectangular diagonal matrix with entries $\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_n \ge 0$.

We claim that the best rank-$k$ approximation to $A$ in the spectral norm, denoted by $\|\cdot\|_2$, is given by

$$A_k := \sum_{i=1}^{k} \sigma_i u_i v_i^\top,$$

where $u_i$ and $v_i$ denote the $i$th column of $U$ and $V$, respectively.

First, note that we have

$$\|A - A_k\|_2 = \Big\| \sum_{i=k+1}^{n} \sigma_i u_i v_i^\top \Big\|_2 = \sigma_{k+1}.$$

Therefore, we need to show that if $B_k = X Y^\top$, where $X$ and $Y$ have $k$ columns, then $\sigma_{k+1} = \|A - A_k\|_2 \le \|A - B_k\|_2$.

Since $Y$ has $k$ columns, there must be a nontrivial linear combination of the first $k+1$ columns of $V$, i.e.,

$$w = \gamma_1 v_1 + \cdots + \gamma_{k+1} v_{k+1},$$

such that $Y^\top w = 0$. Without loss of generality, we can scale $w$ so that $\|w\|_2 = 1$ or (equivalently) $\gamma_1^2 + \cdots + \gamma_{k+1}^2 = 1$. Therefore,

$$\|A - B_k\|_2^2 \ge \|(A - B_k) w\|_2^2 = \|A w\|_2^2 = \gamma_1^2 \sigma_1^2 + \cdots + \gamma_{k+1}^2 \sigma_{k+1}^2 \ge \sigma_{k+1}^2.$$

The result follows by taking the square root of both sides of the above inequality.

Proof of Eckart–Young–Mirsky theorem (for Frobenius norm)

[edit]

Let $A \in \mathbb{R}^{m \times n}$ be a real (possibly rectangular) matrix with $m \ge n$. Suppose that

$$A = U \Sigma V^\top$$

is the singular value decomposition of $A$.

We claim that the best rank-$k$ approximation to $A$ in the Frobenius norm, denoted by $\|\cdot\|_F$, is given by

$$A_k = \sum_{i=1}^{k} \sigma_i u_i v_i^\top,$$

where $u_i$ and $v_i$ denote the $i$th column of $U$ and $V$, respectively.

First, note that we have

$$\|A - A_k\|_F^2 = \Big\| \sum_{i=k+1}^{n} \sigma_i u_i v_i^\top \Big\|_F^2 = \sum_{i=k+1}^{n} \sigma_i^2.$$

Therefore, we need to show that if $B_k = X Y^\top$, where $X$ and $Y$ have $k$ columns, then

$$\|A - A_k\|_F^2 = \sum_{i=k+1}^{n} \sigma_i^2 \le \|A - B_k\|_F^2.$$

By the triangle inequality for the spectral norm, if $A = A' + A''$ then $\sigma_1(A) \le \sigma_1(A') + \sigma_1(A'')$. Suppose $A'_k$ and $A''_k$ respectively denote the rank-$k$ approximations to $A'$ and $A''$ by the SVD method described above. Then, for any $i, j \ge 1$,

$$\sigma_i(A') + \sigma_j(A'') = \sigma_1(A' - A'_{i-1}) + \sigma_1(A'' - A''_{j-1}) \ge \sigma_1(A - A'_{i-1} - A''_{j-1}) \ge \sigma_1(A - A_{i+j-2}) = \sigma_{i+j-1}(A),$$

where the second inequality holds because $\operatorname{rank}(A'_{i-1} + A''_{j-1}) \le i + j - 2$. Since $\sigma_{k+1}(B_k) = 0$, when $A' = A - B_k$ and $A'' = B_k$ we conclude that for $i \ge 1$ and $j = k + 1$,

$$\sigma_i(A - B_k) \ge \sigma_{k+i}(A).$$

Therefore,

$$\|A - B_k\|_F^2 = \sum_{i=1}^{n} \sigma_i(A - B_k)^2 \ge \sum_{i=k+1}^{n} \sigma_i(A)^2 = \|A - A_k\|_F^2,$$

as required.

Weighted low-rank approximation problems

[edit]

The Frobenius norm weights uniformly all elements of the approximation error $D - \widehat{D}$. Prior knowledge about the distribution of the errors can be taken into account by considering the weighted low-rank approximation problem

$$\text{minimize over } \widehat{D} \quad \operatorname{vec}^\top(D - \widehat{D}) \, W \operatorname{vec}(D - \widehat{D}) \quad \text{subject to} \quad \operatorname{rank}(\widehat{D}) \le r,$$

where $\operatorname{vec}(A)$ vectorizes the matrix $A$ column wise and $W$ is a given positive (semi)definite weight matrix.

The general weighted low-rank approximation problem does not admit an analytic solution in terms of the singular value decomposition and is solved by local optimization methods, which provide no guarantee that a globally optimal solution is found.

In the case of uncorrelated weights, the weighted low-rank approximation problem can also be formulated in this way:[4][5] for a non-negative weight matrix $W$ and a data matrix $A$, we want to minimize

$$\sum_{i,j} \big( W_{i,j} (A_{i,j} - B_{i,j}) \big)^2$$

over matrices $B$ of rank at most $r$.
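
In this formulation the objective is evaluated element-wise; a minimal sketch (data and weights illustrative):

% Element-wise weighted objective for a rank-1 candidate B
W = rand(4, 3);                     % non-negative weight matrix
A = randn(4, 3);                    % data matrix
B = randn(4, 1) * randn(1, 3);      % candidate of rank at most 1
obj = sum(sum((W .* (A - B)).^2));  % sum_{i,j} (W_ij (A_ij - B_ij))^2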

Entry-wise Lp low-rank approximation problems

[edit]

Let $\|M\|_p = \big( \sum_{i,j} |M_{i,j}|^p \big)^{1/p}$. For $p = 2$, the fastest algorithm runs in $O(\operatorname{nnz}(A)) + (m + n) \cdot \operatorname{poly}(k/\varepsilon)$ time, where $k$ is the desired rank and $\varepsilon$ an accuracy parameter.[6][7] One of the important ideas used here is the oblivious subspace embedding (OSE), first proposed by Sarlos.[8]

For $p = 1$, it is known that the entry-wise L1 norm is more robust than the Frobenius norm in the presence of outliers, and is indicated in models where Gaussian assumptions on the noise may not apply. It is natural to seek to minimize $\|B - A\|_1$ over matrices $B$ of rank at most $k$.[9] For $p = 0$ and $p \ge 1$, there are some algorithms with provable guarantees.[10][11]
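
The following toy comparison (not one of the cited algorithms) illustrates the robustness claim: a single gross outlier drags the Frobenius-optimal rank-1 fit away from the clean entries, while the entry-wise L1 objective favors fitting the bulk of the data:

% Entry-wise L1 vs Frobenius fitting with one gross outlier
n = 20;
A = ones(n); A(1, 1) = 30;        % rank-1 data plus one outlier
B1 = ones(n);                     % rank-1 fit of the clean bulk
[U, S, V] = svd(A);
B2 = U(:, 1) * S(1, 1) * V(:, 1)';  % Frobenius-optimal rank-1 fit
fprintf('Frobenius: bulk fit %.1f, SVD fit %.1f\n', ...
        norm(A - B1, 'fro'), norm(A - B2, 'fro'));
fprintf('L1:        bulk fit %.1f, SVD fit %.1f\n', ...
        sum(abs(A(:) - B1(:))), sum(abs(A(:) - B2(:))));
% The SVD fit wins under the Frobenius norm by construction, but the
% bulk fit wins under L1: the outlier enters the L1 error only linearly.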

Distance low-rank approximation problem

[edit]

Let $P = \{p_1, \ldots, p_m\}$ and $Q = \{q_1, \ldots, q_n\}$ be two point sets in an arbitrary metric space. Let $A$ represent the $m \times n$ matrix where $A_{i,j} = \operatorname{dist}(p_i, q_j)$. Such distance matrices are commonly computed in software packages and have applications to learning image manifolds, handwriting recognition, and multi-dimensional unfolding. In an attempt to reduce their description size,[12][13] one can study low-rank approximation of such matrices.
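
For illustration, the following sketch forms a Euclidean distance matrix between two planar point sets and compresses it with a truncated SVD (the cited works achieve this with sublinear sampling rather than a full SVD):

% Distance matrix between two point sets in R^2, compressed by SVD
m = 50; n = 40; r = 4;
P = randn(2, m); Q = randn(2, n);                % point sets
A = sqrt(max(sum(P.^2, 1)' + sum(Q.^2, 1) - 2 * (P' * Q), 0));  % A(i,j) = dist(p_i, q_j)
[U, S, V] = svd(A);
Ah = U(:, 1:r) * S(1:r, 1:r) * V(:, 1:r)';       % rank-4 description
disp(norm(A - Ah, 'fro') / norm(A, 'fro'))       % relative error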

Distributed/Streaming low-rank approximation problem

[edit]

The low-rank approximation problem in the distributed and streaming setting has been considered in the literature.[14]

Image and kernel representations of the rank constraints

[edit]

Using the equivalences

$$\operatorname{rank}(\widehat{D}) \le r \quad \Longleftrightarrow \quad \text{there are } P \in \mathbb{R}^{m \times r} \text{ and } L \in \mathbb{R}^{r \times n} \text{ such that } \widehat{D} = P L$$

and

$$\operatorname{rank}(\widehat{D}) \le r \quad \Longleftrightarrow \quad \text{there is a full row rank } R \in \mathbb{R}^{(m - r) \times m} \text{ such that } R \widehat{D} = 0,$$

the weighted low-rank approximation problem becomes equivalent to the parameter optimization problems

$$\text{minimize over } \widehat{D}, P, L \quad \operatorname{vec}^\top(D - \widehat{D}) \, W \operatorname{vec}(D - \widehat{D}) \quad \text{subject to} \quad \widehat{D} = P L$$

and

$$\text{minimize over } \widehat{D}, R \quad \operatorname{vec}^\top(D - \widehat{D}) \, W \operatorname{vec}(D - \widehat{D}) \quad \text{subject to} \quad R \widehat{D} = 0 \ \text{ and } \ R R^\top = I_{m-r},$$

where $I_{m-r}$ is the identity matrix of size $m - r$.
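
The two representations can be checked numerically; a minimal sketch (variable names illustrative):

% Image representation: D = P*L has rank at most r
m = 6; n = 5; r = 2;
P = randn(m, r); L = randn(r, n);
Dh = P * L;
disp(rank(Dh))                 % at most r
% Kernel representation: rows of R span the left null space of Dh
[U, ~, ~] = svd(Dh);
R = U(:, r+1:end)';            % (m - r) x m, full row rank, R*R' = I
disp(norm(R * Dh))             % ~ 0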

Alternating projections algorithm

[edit]

The image representation of the rank constraint suggests a parameter optimization method in which the cost function is minimized alternately over one of the variables ($P$ or $L$) with the other one fixed. Although simultaneous minimization over both $P$ and $L$ is a difficult biconvex optimization problem, minimization over one of the variables alone is a linear least squares problem and can be solved globally and efficiently.

The resulting optimization algorithm (called alternating projections) is globally convergent with a linear convergence rate to a locally optimal solution of the weighted low-rank approximation problem. A starting value for the $P$ (or $L$) parameter should be given. The iteration is stopped when a user-defined convergence condition is satisfied.

Matlab implementation of the alternating projections algorithm for weighted low-rank approximation:

function [dh, f] = wlra_ap(d, w, p, tol, maxiter)
[m, n] = size(d); r = size(p, 2); f = inf;
for i = 2:maxiter
    % minimization over L
    bp = kron(eye(n), p);
    vl = (bp' * w * bp) \ bp' * w * d(:);
    l  = reshape(vl, r, n);
    % minimization over P
    bl = kron(l', eye(m));
    vp = (bl' * w * bl) \ bl' * w * d(:);
    p  = reshape(vp, m, r);
    % check exit condition
    dh = p * l; dd = d - dh;
    f(i) = dd(:)' * w * dd(:);
    if abs(f(i - 1) - f(i)) < tol, break, end
end
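
A hypothetical call, initializing $P$ with the unweighted truncated SVD (the data and weights below are illustrative):

% Example use of wlra_ap on noisy low-rank data
m = 10; n = 8; r = 2;
d = randn(m, r) * randn(r, n) + 0.1 * randn(m, n);
w = diag(rand(m * n, 1) + 0.5);      % positive definite diagonal weights
[u, s, v] = svd(d);
[dh, f] = wlra_ap(d, w, u(:, 1:r), 1e-6, 100);
% dh is the rank-r approximation; f holds the objective value per iteration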

Variable projections algorithm

[edit]

The alternating projections algorithm exploits the fact that the low rank approximation problem, parameterized in the image form, is bilinear in the variables $P$ and $L$. The bilinear nature of the problem is effectively used in an alternative approach, called variable projections.[15]

Consider again the weighted low rank approximation problem, parameterized in the image form. Minimization with respect to the variable $L$ (a linear least squares problem) leads to the closed form expression of the approximation error as a function of $P$:

$$f(P) = \sqrt{ \operatorname{vec}^\top(D) \Big( W - W (I_n \otimes P) \big( (I_n \otimes P)^\top W (I_n \otimes P) \big)^{-1} (I_n \otimes P)^\top W \Big) \operatorname{vec}(D) }.$$

The original problem is therefore equivalent to the nonlinear least squares problem of minimizing $f(P)$ with respect to $P$. For this purpose standard optimization methods, e.g. the Levenberg–Marquardt algorithm, can be used.

Matlab implementation of the variable projections algorithm for weighted low-rank approximation:

function [dh, f] = wlra_varpro(d, w, p, tol, maxiter)
prob = optimset(); prob.solver = 'lsqnonlin';
prob.options = optimset('MaxIter', maxiter, 'TolFun', tol);
prob.x0 = p; prob.objective = @(p) cost_fun(p, d, w);
[p, f] = lsqnonlin(prob);       % outer minimization over P
[f, vl] = cost_fun(p, d, w);    % recover the optimal L for the final P
dh = p * reshape(vl, size(p, 2), size(d, 2));

function [f, vl] = cost_fun(p, d, w)
bp = kron(eye(size(d, 2)), p);               % bp = I_n kron P
vl = (bp' * w * bp) \ (bp' * w * d(:));      % inner least squares over L
f = d(:)' * w * (d(:) - bp * vl);            % squared approximation error
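
A matching hypothetical call (lsqnonlin requires the Optimization Toolbox in MATLAB or the optim package in Octave):

% Example use of wlra_varpro
m = 10; n = 8; r = 2;
d = randn(m, r) * randn(r, n) + 0.1 * randn(m, n);
w = eye(m * n);                      % unweighted case for simplicity
[u, s, v] = svd(d);
[dh, f] = wlra_varpro(d, w, u(:, 1:r), 1e-6, 100);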

The variable projections approach can also be applied to low rank approximation problems parameterized in the kernel form. The method is effective when the number of eliminated variables is much larger than the number of optimization variables left at the stage of the nonlinear least squares minimization. Such problems occur in system identification, parameterized in the kernel form, where the eliminated variables are the approximating trajectory and the remaining variables are the model parameters. In the context of linear time-invariant systems, the elimination step is equivalent to Kalman smoothing.

A Variant: convex-restricted low rank approximation

[edit]

Usually, we want the approximating matrix not only to be of low rank, but also to satisfy other convex constraints imposed by the application. The problem of interest is then

$$\text{minimize over } \widehat{p} \quad \|p - \widehat{p}\| \quad \text{subject to} \quad \operatorname{rank}\big(\mathcal{S}(\widehat{p})\big) \le r \ \text{ and } \ g(\widehat{p}) \le 0,$$

where $g$ is a convex function encoding the additional constraints.

This problem has many real-world applications, including recovering a good solution from an inexact (semidefinite programming) relaxation. If the additional constraint is linear, as when all elements are required to be nonnegative, the problem is called structured low rank approximation.[16] The more general form is named convex-restricted low rank approximation.

This problem is challenging because of the combination of the convex and nonconvex (low-rank) constraints. Different techniques have been developed based on different realizations of $g$. The alternating direction method of multipliers (ADMM) can be applied to solve the nonconvex problem with a convex objective function, rank constraints and other convex constraints,[17] and is thus suitable for solving the problem above. Moreover, unlike general nonconvex problems, ADMM guarantees convergence to a feasible solution as long as its dual variable converges in the iterations.
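
For concreteness, a minimal sketch of such an ADMM splitting, assuming (for illustration) that the convex constraint is element-wise nonnegativity and the objective is the Frobenius misfit; this follows the generic heuristic pattern of [17], not a specific published implementation:

% ADMM heuristic for: minimize ||D - X||_F^2
%                     subject to rank(X) <= r and X >= 0
function X = rank_admm(D, r, rho, iters)
  [m, n] = size(D);
  Z = zeros(m, n); U = zeros(m, n);   % splitting variable, scaled dual
  for k = 1:iters
    % X-update: entry-wise quadratic plus nonnegativity (closed form)
    X = max((2 * D + rho * (Z - U)) / (2 + rho), 0);
    % Z-update: project X + U onto the rank-r set (truncated SVD)
    [Us, Ss, Vs] = svd(X + U);
    Z = Us(:, 1:r) * Ss(1:r, 1:r) * Vs(:, 1:r)';
    U = U + X - Z;                    % dual ascent on the constraint X = Z
  end
end

% Example: Xh = rank_admm(abs(randn(8, 6)), 2, 1, 200);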

See also

[edit]

References

[edit]
  1. ^ E. Schmidt, Zur Theorie der linearen und nichtlinearen Integralgleichungen, Math. Annalen 63 (1907), 433-476. doi:10.1007/BF01449770
  2. ^ C. Eckart, G. Young, The approximation of one matrix by another of lower rank. Psychometrika, Volume 1, 1936, Pages 211–8. doi:10.1007/BF02288367
  3. ^ L. Mirsky, Symmetric gauge functions and unitarily invariant norms, Q.J. Math. 11 (1960), 50-59. doi:10.1093/qmath/11.1.50
  4. ^ Srebro, Nathan; Jaakkola, Tommi (2003). Weighted Low-Rank Approximations (PDF). ICML'03.
  5. ^ Razenshteyn, Ilya; Song, Zhao; Woodruff, David P. (2016). Weighted Low Rank Approximations with Provable Guarantees. STOC '16 Proceedings of the forty-eighth annual ACM symposium on Theory of Computing.
  6. ^ Clarkson, Kenneth L.; Woodruff, David P. (2013). Low Rank Approximation and Regression in Input Sparsity Time. STOC '13 Proceedings of the forty-fifth annual ACM symposium on Theory of Computing. arXiv:1207.6365.
  7. ^ Nelson, Jelani; Nguyen, Huy L. (2013). OSNAP: Faster numerical linear algebra algorithms via sparser subspace embeddings. FOCS '13. arXiv:1211.1002.
  8. ^ Sarlos, Tamas (2006). Improved approximation algorithms for large matrices via random projections. FOCS'06.
  9. ^ Song, Zhao; Woodruff, David P.; Zhong, Peilin (2017). Low Rank Approximation with Entrywise L1-Norm Error. STOC '17 Proceedings of the forty-ninth annual ACM symposium on Theory of Computing. arXiv:1611.00898.
  10. ^ Bringmann, Karl; Kolev, Pavel; Woodruff, David P. (2017). Approximation Algorithms for L0-Low Rank Approximation. NIPS'17. arXiv:1710.11253.
  11. ^ Chierichetti, Flavio; Gollapudi, Sreenivas; Kumar, Ravi; Lattanzi, Silvio; Panigrahy, Rina; Woodruff, David P. (2017). Algorithms for Lp Low-Rank Approximation. ICML'17. arXiv:1705.06730.
  12. ^ Bakshi, Ainesh L.; Woodruff, David P. (2018). Sublinear Time Low-Rank Approximation of Distance Matrices. NeurIPS. arXiv:1809.06986.
  13. ^ Indyk, Piotr; Vakilian, Ali; Wagner, Tal; Woodruff, David P. (2019). Sample-Optimal Low-Rank Approximation of Distance Matrices. COLT.
  14. ^ Boutsidis, Christos; Woodruff, David P.; Zhong, Peilin (2016). Optimal Principal Component Analysis in Distributed and Streaming Models. STOC. arXiv:1504.06729.
  15. ^ G. Golub and V. Pereyra, Separable nonlinear least squares: the variable projection method and its applications, Institute of Physics, Inverse Problems, Volume 19, 2003, Pages 1-26.
  16. ^ Chu, Moody T.; Funderlic, Robert E.; Plemmons, Robert J. (2003). "structured low-rank approximation". Linear Algebra and Its Applications. 366: 157–172. doi:10.1016/S0024-3795(02)00505-0.
  17. ^ "A General System for Heuristic Solution of Convex Problems over Nonconvex Sets" (PDF).