
Block matrix

From Wikipedia, the free encyclopedia

In mathematics, a block matrix or a partitioned matrix is a matrix that is interpreted as having been broken into sections called blocks or submatrices.[1][2]

Intuitively, a matrix interpreted as a block matrix can be visualized as the original matrix with a collection of horizontal and vertical lines, which break it up, or partition it, into a collection of smaller matrices.[3][2] For example, the 3×4 matrix presented below is divided by horizontal and vertical lines into four blocks: the top-left 2×3 block, the top-right 2×1 block, the bottom-left 1×3 block, and the bottom-right 1×1 block.

Any matrix may be interpreted as a block matrix in one or more ways, with each interpretation defined by how its rows and columns are partitioned.

This notion can be made more precise for an $m \times n$ matrix $M$ by partitioning the row indices $\{1, \dots, m\}$ into a collection of consecutive groups, and then partitioning the column indices $\{1, \dots, n\}$ into another such collection. The original matrix is then considered as the "total" of these groups, in the sense that each $(i, j)$ entry of the original matrix corresponds in a 1-to-1 way with some $(s, t)$ offset entry of some block $M_{(x, y)}$, where $x$ is the row group containing $i$ and $y$ is the column group containing $j$.[4]
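Concretely, such a partition can be produced and undone by array slicing. The following NumPy sketch (array values are arbitrary; the shapes match the 3×4 example described above) extracts the four blocks and reassembles them:

```python
import numpy as np

# An arbitrary 3x4 matrix, partitioned after row 2 and column 3,
# giving a 2x3, 2x1, 1x3 and 1x1 block.
M = np.arange(12).reshape(3, 4)

top_left     = M[:2, :3]   # 2x3 block
top_right    = M[:2, 3:]   # 2x1 block
bottom_left  = M[2:, :3]   # 1x3 block
bottom_right = M[2:, 3:]   # 1x1 block

# Reassembling the blocks reproduces the original matrix.
reassembled = np.block([[top_left, top_right],
                        [bottom_left, bottom_right]])
assert (reassembled == M).all()
```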

Block matrix algebra arises in general from biproducts in categories of matrices.[5]

A 168×168 element block matrix with 12×12, 12×24, 24×12, and 24×24 sub-matrices. Non-zero elements are in blue, zero elements are grayed.

Example


The matrix

$P = \begin{bmatrix} 1 & 2 & 2 & 7 \\ 1 & 5 & 6 & 2 \\ 3 & 3 & 4 & 5 \\ 3 & 3 & 6 & 7 \end{bmatrix}$

can be visualized as divided into four blocks, as

$P = \left[\begin{array}{cc|cc} 1 & 2 & 2 & 7 \\ 1 & 5 & 6 & 2 \\ \hline 3 & 3 & 4 & 5 \\ 3 & 3 & 6 & 7 \end{array}\right].$

The horizontal and vertical lines have no special mathematical meaning,[6][7] but are a common way to visualize a partition.[6][7] By this partition, $P$ is partitioned into four 2×2 blocks, as

$P_{11} = \begin{bmatrix} 1 & 2 \\ 1 & 5 \end{bmatrix}, \quad P_{12} = \begin{bmatrix} 2 & 7 \\ 6 & 2 \end{bmatrix}, \quad P_{21} = \begin{bmatrix} 3 & 3 \\ 3 & 3 \end{bmatrix}, \quad P_{22} = \begin{bmatrix} 4 & 5 \\ 6 & 7 \end{bmatrix}.$

The partitioned matrix can then be written as

$P = \begin{bmatrix} P_{11} & P_{12} \\ P_{21} & P_{22} \end{bmatrix}.$[8]

Formal definition


Let $A \in \mathbb{C}^{m \times n}$. A partitioning of $A$ is a representation of $A$ in the form

$A = \begin{bmatrix} A_{11} & A_{12} & \cdots & A_{1q} \\ A_{21} & A_{22} & \cdots & A_{2q} \\ \vdots & \vdots & & \vdots \\ A_{p1} & A_{p2} & \cdots & A_{pq} \end{bmatrix},$

where $A_{ij} \in \mathbb{C}^{m_i \times n_j}$ are contiguous submatrices, $\sum_{i=1}^{p} m_i = m$, and $\sum_{j=1}^{q} n_j = n$.[9] The elements $A_{ij}$ of the partition are called blocks.[9]

By this definition, the blocks in any one column must all have the same number of columns.[9] Similarly, the blocks in any one row must have the same number of rows.[9]

Partitioning methods


A matrix $A$ can be partitioned in many ways.[9] For example, a matrix $A$ is said to be partitioned by columns if it is written as

$A = \begin{bmatrix} a_1 & a_2 & \cdots & a_n \end{bmatrix},$

where $a_j$ is the $j$th column of $A$.[9] A matrix can also be partitioned by rows:

$A = \begin{bmatrix} b_1^{T} \\ b_2^{T} \\ \vdots \\ b_m^{T} \end{bmatrix},$

where $b_i^{T}$ is the $i$th row of $A$.[9]

Common partitions


Often,[9] we encounter the 2×2 partition

$A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix},$[9]

particularly in the form where $A_{11}$ is a scalar:

$A = \begin{bmatrix} a_{11} & a_{12}^{T} \\ a_{21} & A_{22} \end{bmatrix}.$[9]

Block matrix operations


Transpose


Let

$A = \begin{bmatrix} A_{11} & A_{12} & \cdots & A_{1q} \\ A_{21} & A_{22} & \cdots & A_{2q} \\ \vdots & \vdots & & \vdots \\ A_{p1} & A_{p2} & \cdots & A_{pq} \end{bmatrix},$

where $A_{ij} \in \mathbb{C}^{m_i \times n_j}$. (This matrix will be reused in § Addition and § Multiplication.) Then its transpose is

$A^{T} = \begin{bmatrix} A_{11}^{T} & A_{21}^{T} & \cdots & A_{p1}^{T} \\ A_{12}^{T} & A_{22}^{T} & \cdots & A_{p2}^{T} \\ \vdots & \vdots & & \vdots \\ A_{1q}^{T} & A_{2q}^{T} & \cdots & A_{pq}^{T} \end{bmatrix},$[9][10]

and the same equation holds with the transpose replaced by the conjugate transpose.[9]

Block transpose


A special form of matrix transpose can also be defined for block matrices, where individual blocks are reordered but not transposed. Let $A$ be a block matrix with $p \times q$ blocks $A_{ij}$, each of the same size; the block transpose of $A$ is the block matrix $A^{\mathcal{B}}$ with $q \times p$ blocks $\left(A^{\mathcal{B}}\right)_{ij} = A_{ji}$.[11] As with the conventional transpose, the block transpose is a linear mapping, so that $(A + C)^{\mathcal{B}} = A^{\mathcal{B}} + C^{\mathcal{B}}$.[10] However, in general the property $(AC)^{\mathcal{B}} = C^{\mathcal{B}} A^{\mathcal{B}}$ does not hold unless the blocks of $A$ and $C$ commute.
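The distinction between the ordinary transpose and the block transpose can be checked numerically. In the following NumPy sketch (block contents are arbitrary random values), the ordinary transpose both reorders the blocks and transposes each one, while the block transpose only reorders them:

```python
import numpy as np

rng = np.random.default_rng(0)
# A 2x2 grid of 2x2 blocks; blocks[i][j] is the (i, j) block.
blocks = [[rng.standard_normal((2, 2)) for _ in range(2)] for _ in range(2)]
A = np.block(blocks)

# Ordinary transpose: blocks swap positions AND each block is transposed.
ordinary = np.block([[blocks[0][0].T, blocks[1][0].T],
                     [blocks[0][1].T, blocks[1][1].T]])
assert np.allclose(ordinary, A.T)

# Block transpose: blocks swap positions but are NOT individually
# transposed, so in general it differs from the ordinary transpose.
block_T = np.block([[blocks[0][0], blocks[1][0]],
                    [blocks[0][1], blocks[1][1]]])
assert not np.allclose(block_T, A.T)
```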

Addition


Let

$B = \begin{bmatrix} B_{11} & B_{12} & \cdots & B_{1s} \\ B_{21} & B_{22} & \cdots & B_{2s} \\ \vdots & \vdots & & \vdots \\ B_{r1} & B_{r2} & \cdots & B_{rs} \end{bmatrix},$

where $B_{ij} \in \mathbb{C}^{k_i \times \ell_j}$, and let $A$ be the matrix defined in § Transpose. (This matrix will be reused in § Multiplication.) If $p = r$, $q = s$, $m_i = k_i$ for each $i$, and $n_j = \ell_j$ for each $j$, then

$A + B = \begin{bmatrix} A_{11} + B_{11} & A_{12} + B_{12} & \cdots & A_{1q} + B_{1q} \\ A_{21} + B_{21} & A_{22} + B_{22} & \cdots & A_{2q} + B_{2q} \\ \vdots & \vdots & & \vdots \\ A_{p1} + B_{p1} & A_{p2} + B_{p2} & \cdots & A_{pq} + B_{pq} \end{bmatrix}.$[9]

Multiplication


It is possible to use a block partitioned matrix product that involves only algebra on submatrices of the factors. The partitioning of the factors is not arbitrary, however, and requires "conformable partitions"[12] between two matrices and such that all submatrix products that will be used are defined.[13]

Two matrices $A$ and $B$ are said to be partitioned conformally for the product $AB$, when $A$ and $B$ are partitioned into submatrices and if the multiplication $AB$ is carried out treating the submatrices as if they are scalars, but keeping the order, and when all products and sums of submatrices involved are defined.

— Arak M. Mathai and Hans J. Haubold, Linear Algebra: A Course for Physicists and Engineers[14]

Let $A$ be the matrix defined in § Transpose, and let $B$ be the matrix defined in § Addition, with $q = r$ and $n_\gamma = k_\gamma$ for each $\gamma$, so that the partitions are conformable. Then the matrix product

$C = AB$

can be performed blockwise, yielding $C$ as a matrix with $p \times s$ blocks. The blocks of the resulting matrix are calculated by multiplying:

$C_{\alpha\beta} = \sum_{\gamma=1}^{q} A_{\alpha\gamma} B_{\gamma\beta}.$[6]

Or, using the Einstein notation that implicitly sums over repeated indices:

$C_{\alpha\beta} = A_{\alpha\gamma} B_{\gamma\beta}.$

Depicting $C$ as a block matrix, we have

$C = \begin{bmatrix} \sum_{\gamma=1}^{q} A_{1\gamma} B_{\gamma 1} & \cdots & \sum_{\gamma=1}^{q} A_{1\gamma} B_{\gamma s} \\ \vdots & & \vdots \\ \sum_{\gamma=1}^{q} A_{p\gamma} B_{\gamma 1} & \cdots & \sum_{\gamma=1}^{q} A_{p\gamma} B_{\gamma s} \end{bmatrix}.$[9]
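The blockwise product formula can be verified against an ordinary matrix product. In this NumPy sketch the partitions are chosen conformally, i.e. the column split of the first factor matches the row split of the second:

```python
import numpy as np

rng = np.random.default_rng(1)
# A is 4x6 split into a 2x2 grid of blocks; B is 6x4 split conformally,
# i.e. B's row partition (3 + 3) matches A's column partition.
A = rng.standard_normal((4, 6))
B = rng.standard_normal((6, 4))
A11, A12 = A[:2, :3], A[:2, 3:]
A21, A22 = A[2:, :3], A[2:, 3:]
B11, B12 = B[:3, :2], B[:3, 2:]
B21, B22 = B[3:, :2], B[3:, 2:]

# C_ab = sum_g A_ag @ B_gb: treat blocks like scalars but keep the order.
C = np.block([[A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
              [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22]])
assert np.allclose(C, A @ B)
```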

Inversion


If a matrix is partitioned into four blocks, it can be inverted blockwise as follows:

$P^{-1} = \begin{bmatrix} A & B \\ C & D \end{bmatrix}^{-1} = \begin{bmatrix} A^{-1} + A^{-1}B(P/A)^{-1}CA^{-1} & -A^{-1}B(P/A)^{-1} \\ -(P/A)^{-1}CA^{-1} & (P/A)^{-1} \end{bmatrix},$

where A and D are square blocks of arbitrary size, and B and C are conformable with them for partitioning. Furthermore, A and the Schur complement of A in P: $P/A = D - CA^{-1}B$ must be invertible.[15]

Equivalently, by permuting the blocks:

$P^{-1} = \begin{bmatrix} (P/D)^{-1} & -(P/D)^{-1}BD^{-1} \\ -D^{-1}C(P/D)^{-1} & D^{-1} + D^{-1}C(P/D)^{-1}BD^{-1} \end{bmatrix}.$[16]

Here, D and the Schur complement of D in P: $P/D = A - BD^{-1}C$ must be invertible.

If A and D are both invertible, then:

$\begin{bmatrix} A & B \\ C & D \end{bmatrix}^{-1} = \begin{bmatrix} (A - BD^{-1}C)^{-1} & 0 \\ 0 & (D - CA^{-1}B)^{-1} \end{bmatrix} \begin{bmatrix} I & -BD^{-1} \\ -CA^{-1} & I \end{bmatrix}.$

By the Weinstein–Aronszajn identity, one of the two matrices in the block-diagonal matrix is invertible exactly when the other is.
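The first inversion formula can be checked numerically. This NumPy sketch uses a random matrix with a diagonal shift added so that the relevant blocks are safely invertible, and builds the inverse of P from the inverse of A and the Schur complement P/A:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
# A well-conditioned 6x6 matrix, partitioned into a 2x2 grid of 3x3 blocks.
P = rng.standard_normal((n, n)) + n * np.eye(n)
A, B = P[:3, :3], P[:3, 3:]
C, D = P[3:, :3], P[3:, 3:]

Ainv = np.linalg.inv(A)
S = D - C @ Ainv @ B          # Schur complement P/A
Sinv = np.linalg.inv(S)

# Blockwise inverse, assembled exactly as in the formula above.
Pinv = np.block([
    [Ainv + Ainv @ B @ Sinv @ C @ Ainv, -Ainv @ B @ Sinv],
    [-Sinv @ C @ Ainv,                   Sinv],
])
assert np.allclose(Pinv, np.linalg.inv(P))
```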

Computing submatrix inverses from the full inverse


By the symmetry between a matrix and its inverse in the block inversion formula, if a matrix P and its inverse P⁻¹ are partitioned conformally:

$P = \begin{bmatrix} A & B \\ C & D \end{bmatrix}, \qquad P^{-1} = \begin{bmatrix} E & F \\ G & H \end{bmatrix},$

then the inverse of any principal submatrix can be computed from the corresponding blocks of P⁻¹:

$A^{-1} = E - F H^{-1} G, \qquad D^{-1} = H - G E^{-1} F.$

This relationship follows from recognizing that $E^{-1} = A - BD^{-1}C$ (the Schur complement), and applying the same block inversion formula with the roles of P and P⁻¹ reversed.[17][18]
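A short NumPy sketch of this relationship (matrix values are arbitrary, with a diagonal shift to ensure invertibility): the inverse of the principal submatrix A is recovered from the blocks E, F, G, H of P⁻¹ alone.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6
P = rng.standard_normal((n, n)) + n * np.eye(n)
Pinv = np.linalg.inv(P)

A = P[:3, :3]
E, F = Pinv[:3, :3], Pinv[:3, 3:]
G, H = Pinv[3:, :3], Pinv[3:, 3:]

# Inverse of the principal submatrix A from the blocks of P^{-1}:
# A^{-1} = E - F H^{-1} G.
Ainv = E - F @ np.linalg.inv(H) @ G
assert np.allclose(Ainv, np.linalg.inv(A))
```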

Determinant


The formula for the determinant of a $2 \times 2$ matrix above continues to hold, under appropriate further assumptions, for a matrix $M = \begin{bmatrix} A & B \\ C & D \end{bmatrix}$ composed of four submatrices $A, B, C, D$ with $A$ and $D$ square. The easiest such formula, which can be proven using either the Leibniz formula or a factorization involving the Schur complement, is

$\det \begin{bmatrix} A & 0 \\ C & D \end{bmatrix} = \det(A)\det(D) = \det \begin{bmatrix} A & B \\ 0 & D \end{bmatrix}.$[16]

Using this formula, we can derive that the characteristic polynomials of $\begin{bmatrix} A & 0 \\ C & D \end{bmatrix}$ and $\begin{bmatrix} A & B \\ 0 & D \end{bmatrix}$ are the same and equal to the product of the characteristic polynomials of $A$ and $D$. Furthermore, if $\begin{bmatrix} A & 0 \\ C & D \end{bmatrix}$ or $\begin{bmatrix} A & B \\ 0 & D \end{bmatrix}$ is diagonalizable, then $A$ and $D$ are diagonalizable too. The converse is false; simply check $\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$.

If $A$ is invertible, one has

$\det \begin{bmatrix} A & B \\ C & D \end{bmatrix} = \det(A)\det\!\left(D - CA^{-1}B\right),$[16]

and if $D$ is invertible, one has

$\det \begin{bmatrix} A & B \\ C & D \end{bmatrix} = \det(D)\det\!\left(A - BD^{-1}C\right).$[19][16]

If the blocks are square matrices of the same size, further formulas hold. For example, if $C$ and $D$ commute (i.e., $CD = DC$), then

$\det \begin{bmatrix} A & B \\ C & D \end{bmatrix} = \det(AD - BC).$[20]

Similar statements hold when $AB = BA$, $AC = CA$, or $BD = DB$. Namely, if $AB = BA$, then

$\det \begin{bmatrix} A & B \\ C & D \end{bmatrix} = \det(DA - CB).$

Note the change in order of $A$ and $D$ (we have $DA$ instead of $AD$). Similarly, if $AC = CA$, then $BC$ should be replaced with $CB$ (i.e. we get $\det(AD - CB)$), and if $BD = DB$, then we should have $\det(DA - BC)$. Note for the last two results, you have to use commutativity of the underlying ring, but not for the first two.

This formula has been generalized to matrices composed of more than $2 \times 2$ blocks, again under appropriate commutativity conditions among the individual blocks.[21]

For $A = D$ and $B = C$, the following formula holds (even if $A$ and $B$ do not commute):

$\det \begin{bmatrix} A & B \\ B & A \end{bmatrix} = \det(A - B)\det(A + B).$[16]
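Two of these determinant identities can be checked numerically with NumPy (random blocks with a diagonal shift so that A is invertible; the second identity needs no commutativity at all):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 3)) + 3 * np.eye(3)
B = rng.standard_normal((3, 3))
C = rng.standard_normal((3, 3))
D = rng.standard_normal((3, 3)) + 3 * np.eye(3)
P = np.block([[A, B], [C, D]])

# det(P) = det(A) * det(D - C A^{-1} B) when A is invertible.
schur = D - C @ np.linalg.inv(A) @ B
assert np.isclose(np.linalg.det(P), np.linalg.det(A) * np.linalg.det(schur))

# For the special pattern [[A, B], [B, A]]:
# det = det(A - B) * det(A + B), with no commutativity required.
Q = np.block([[A, B], [B, A]])
assert np.isclose(np.linalg.det(Q),
                  np.linalg.det(A - B) * np.linalg.det(A + B))
```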

Special types of block matrices


Direct sums and block diagonal matrices


Direct sum


For any arbitrary matrices A (of size m × n) and B (of size p × q), we have the direct sum of A and B, denoted by A ⊕ B and defined as

$A \oplus B = \begin{bmatrix} a_{11} & \cdots & a_{1n} & 0 & \cdots & 0 \\ \vdots & & \vdots & \vdots & & \vdots \\ a_{m1} & \cdots & a_{mn} & 0 & \cdots & 0 \\ 0 & \cdots & 0 & b_{11} & \cdots & b_{1q} \\ \vdots & & \vdots & \vdots & & \vdots \\ 0 & \cdots & 0 & b_{p1} & \cdots & b_{pq} \end{bmatrix}.$[10]

For instance,

$\begin{bmatrix} 1 & 3 & 2 \\ 2 & 3 & 1 \end{bmatrix} \oplus \begin{bmatrix} 1 & 6 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 3 & 2 & 0 & 0 \\ 2 & 3 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 & 6 \\ 0 & 0 & 0 & 0 & 1 \end{bmatrix}.$

This operation generalizes naturally to arbitrary dimensioned arrays (provided that A and B have the same number of dimensions).

Note that any element in the direct sum of two vector spaces of matrices could be represented as a direct sum of two matrices.
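In code, the direct sum is a block diagonal assembly with explicit zero blocks; a NumPy sketch with small illustrative matrices:

```python
import numpy as np

A = np.array([[1, 3, 2],
              [2, 3, 1]])
B = np.array([[1, 6],
              [0, 1]])

# A ⊕ B places A and B on the diagonal, padding with zero blocks.
direct_sum = np.block([
    [A, np.zeros((2, 2), dtype=int)],
    [np.zeros((2, 3), dtype=int), B],
])
assert direct_sum.shape == (4, 5)   # (2+2) x (3+2)
```

Equivalently, `scipy.linalg.block_diag(A, B)` produces the same result without writing the zero blocks by hand.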

Block diagonal matrices


A block diagonal matrix is a block matrix that is a square matrix such that the main-diagonal blocks are square matrices and all off-diagonal blocks are zero matrices.[16] That is, a block diagonal matrix A has the form

$A = \begin{bmatrix} A_1 & 0 & \cdots & 0 \\ 0 & A_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & A_n \end{bmatrix},$

where Ak is a square matrix for all k = 1, ..., n. In other words, matrix A is the direct sum of A1, ..., An.[16] It can also be indicated as A1 ⊕ A2 ⊕ ... ⊕ An[10] or diag(A1, A2, ..., An)[10] (the latter being the same formalism used for a diagonal matrix). Any square matrix can trivially be considered a block diagonal matrix with only one block.

For the determinant and trace, the following properties hold:

$\det A = \det A_1 \times \cdots \times \det A_n$,[22][23] and
$\operatorname{tr} A = \operatorname{tr} A_1 + \cdots + \operatorname{tr} A_n$.[16][23]

A block diagonal matrix is invertible if and only if each of its main-diagonal blocks is invertible, and in this case its inverse is another block diagonal matrix given by

$A^{-1} = \begin{bmatrix} A_1^{-1} & 0 & \cdots & 0 \\ 0 & A_2^{-1} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & A_n^{-1} \end{bmatrix}.$[24]

The eigenvalues[25] and eigenvectors of $A$ are simply those of the $A_k$'s combined.[23]
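These properties are easy to confirm numerically. The NumPy sketch below (two random diagonal blocks, shifted to be invertible) checks the blockwise inverse and that the spectrum of the whole matrix is the union of the blocks' spectra:

```python
import numpy as np

rng = np.random.default_rng(5)
A1 = rng.standard_normal((2, 2)) + 2 * np.eye(2)
A2 = rng.standard_normal((3, 3)) + 3 * np.eye(3)
A = np.block([[A1, np.zeros((2, 3))],
              [np.zeros((3, 2)), A2]])

# Inverse: invert each diagonal block independently.
Ainv = np.block([[np.linalg.inv(A1), np.zeros((2, 3))],
                 [np.zeros((3, 2)), np.linalg.inv(A2)]])
assert np.allclose(Ainv, np.linalg.inv(A))

# Eigenvalues: the union of the blocks' eigenvalues.
eigs_blocks = np.concatenate([np.linalg.eigvals(A1), np.linalg.eigvals(A2)])
eigs_full = np.linalg.eigvals(A)
```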

Block tridiagonal matrices


A block tridiagonal matrix is another special block matrix, which is just like the block diagonal matrix a square matrix, having square matrices (blocks) in the lower diagonal, main diagonal and upper diagonal, with all other blocks being zero matrices. It is essentially a tridiagonal matrix but has submatrices in places of scalars. A block tridiagonal matrix has the form

$A = \begin{bmatrix} B_1 & C_1 & & & \\ A_2 & B_2 & C_2 & & \\ & \ddots & \ddots & \ddots & \\ & & A_{n-1} & B_{n-1} & C_{n-1} \\ & & & A_n & B_n \end{bmatrix},$

where $A_k$, $B_k$ and $C_k$ are square sub-matrices of the lower, main and upper diagonal respectively.[26][27]

Block tridiagonal matrices are often encountered in numerical solutions of engineering problems (e.g., computational fluid dynamics). Optimized numerical methods for LU factorization are available,[28] and hence efficient solution algorithms for equation systems with a block tridiagonal matrix as coefficient matrix. The Thomas algorithm, used for efficient solution of equation systems involving a tridiagonal matrix, can also be applied using matrix operations to block tridiagonal matrices (see also Block LU decomposition).
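A block tridiagonal matrix is straightforward to assemble from its three block diagonals. The helper below is an illustrative sketch (the function name is hypothetical; it assumes uniform square blocks of size k):

```python
import numpy as np

def block_tridiag(lower, main, upper):
    """Assemble a block tridiagonal matrix from lists of square blocks.

    main has n blocks; lower and upper have n - 1 blocks each.
    """
    n = len(main)
    k = main[0].shape[0]
    M = np.zeros((n * k, n * k))
    for i in range(n):
        M[i*k:(i+1)*k, i*k:(i+1)*k] = main[i]          # B_{i+1}
        if i > 0:
            M[i*k:(i+1)*k, (i-1)*k:i*k] = lower[i-1]   # A_{i+1}
        if i < n - 1:
            M[i*k:(i+1)*k, (i+1)*k:(i+2)*k] = upper[i] # C_{i+1}
    return M

rng = np.random.default_rng(6)
main  = [rng.standard_normal((2, 2)) for _ in range(3)]
lower = [rng.standard_normal((2, 2)) for _ in range(2)]
upper = [rng.standard_normal((2, 2)) for _ in range(2)]
M = block_tridiag(lower, main, upper)
# Blocks outside the three block diagonals are zero, e.g. block (0, 2):
assert np.all(M[0:2, 4:6] == 0)
```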

Block triangular matrices


Upper block triangular


A matrix $A$ is upper block triangular (or block upper triangular[29]) if

$A = \begin{bmatrix} A_{11} & A_{12} & \cdots & A_{1k} \\ 0 & A_{22} & \cdots & A_{2k} \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & A_{kk} \end{bmatrix},$

where $A_{ij} = 0$ for all $i > j$.[25][29]

Lower block triangular


A matrix $A$ is lower block triangular if

$A = \begin{bmatrix} A_{11} & 0 & \cdots & 0 \\ A_{21} & A_{22} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ A_{k1} & A_{k2} & \cdots & A_{kk} \end{bmatrix},$

where $A_{ij} = 0$ for all $i < j$.[25]

Block Toeplitz matrices


A block Toeplitz matrix is another special block matrix, which contains blocks that are repeated down the diagonals of the matrix, as a Toeplitz matrix has elements repeated down the diagonal.

A matrix $A$ is block Toeplitz if $A_{(i,j)} = A_{(k,l)}$ for all $k - i = l - j$, that is,

$A = \begin{bmatrix} A_{(1,1)} & A_{(1,2)} & A_{(1,3)} & \cdots \\ A_{(2,1)} & A_{(1,1)} & A_{(1,2)} & \cdots \\ A_{(3,1)} & A_{(2,1)} & A_{(1,1)} & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{bmatrix},$

where each block $A_{(i,j)}$ has the same size.[25]
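A block Toeplitz matrix can be generated by indexing blocks by the difference of their block coordinates, so that the same block repeats down each block diagonal; a NumPy sketch with arbitrary 2×2 blocks:

```python
import numpy as np

# One block per block-diagonal, indexed by the difference j - i.
rng = np.random.default_rng(7)
blocks = {d: rng.standard_normal((2, 2)) for d in range(-2, 3)}

n = 3  # 3x3 grid of blocks
T = np.block([[blocks[j - i] for j in range(n)] for i in range(n)])

# Block (i, j) depends only on j - i: block (0, 1) equals block (1, 2).
assert np.allclose(T[0:2, 2:4], T[2:4, 4:6])
```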

Block Hankel matrices


A matrix $A$ is block Hankel if $A_{(i,j)} = A_{(k,l)}$ for all $i + j = k + l$, that is,

$A = \begin{bmatrix} A_{(1,1)} & A_{(1,2)} & A_{(1,3)} & \cdots \\ A_{(1,2)} & A_{(1,3)} & A_{(1,4)} & \cdots \\ A_{(1,3)} & A_{(1,4)} & A_{(1,5)} & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{bmatrix},$

where each block $A_{(i,j)}$ has the same size.[25]

See also

  • Kronecker product (matrix direct product resulting in a block matrix)
  • Jordan normal form (canonical form of a linear operator on a finite-dimensional complex vector space)
  • Strassen algorithm (algorithm for matrix multiplication that is faster than the conventional matrix multiplication algorithm)

Notes

  1. ^ Eves, Howard (1980). Elementary Matrix Theory (reprint ed.). New York: Dover. p. 37. ISBN 0-486-63946-0. Retrieved 24 April 2013. We shall find that it is sometimes convenient to subdivide a matrix into rectangular blocks of elements. This leads us to consider so-called partitioned, or block, matrices.
  2. ^ a b Dobrushkin, Vladimir. "Partition Matrices". Linear Algebra with Mathematica. Retrieved 2025-08-07.
  3. ^ Anton, Howard (1994). Elementary Linear Algebra (7th ed.). New York: John Wiley. p. 30. ISBN 0-471-58742-7. A matrix can be subdivided or partitioned into smaller matrices by inserting horizontal and vertical rules between selected rows and columns.
  4. ^ Indhumathi, D.; Sarala, S. (2014). "Fragment Analysis and Test Case Generation using F-Measure for Adaptive Random Testing and Partitioned Block based Adaptive Random Testing" (PDF). International Journal of Computer Applications. 93 (6): 13. Bibcode:2014IJCA...93f..11I. doi:10.5120/16218-5662.
  5. ^ Macedo, H.D.; Oliveira, J.N. (2013). "Typing linear algebra: A biproduct-oriented approach". Science of Computer Programming. 78 (11): 2160–2191. arXiv:1312.4818. doi:10.1016/j.scico.2012.07.012.
  6. ^ a b c Johnston, Nathaniel (2021). Introduction to linear and matrix algebra. Cham, Switzerland: Springer Nature. pp. 30, 425. ISBN 978-3-030-52811-9.
  7. ^ a b Johnston, Nathaniel (2021). Advanced linear and matrix algebra. Cham, Switzerland: Springer Nature. p. 298. ISBN 978-3-030-52814-0.
  8. ^ Jeffrey, Alan (2010). Matrix operations for engineers and scientists: an essential guide in linear algebra. Dordrecht [Netherlands] ; New York: Springer. p. 54. ISBN 978-90-481-9273-1. OCLC 639165077.
  9. ^ a b c d e f g h i j k l m n Stewart, Gilbert W. (1998). Matrix algorithms. 1: Basic decompositions. Philadelphia, PA: Soc. for Industrial and Applied Mathematics. pp. 18–20. ISBN 978-0-89871-414-2.
  10. ^ a b c d e Gentle, James E. (2007). Matrix Algebra: Theory, Computations, and Applications in Statistics. Springer Texts in Statistics. New York, NY: Springer New York Springer e-books. pp. 47, 487. ISBN 978-0-387-70873-7.
  11. ^ Mackey, D. Steven (2006). Structured linearizations for matrix polynomials (PDF) (Thesis). University of Manchester. ISSN 1749-9097. OCLC 930686781.
  12. ^ Eves, Howard (1980). Elementary Matrix Theory (reprint ed.). New York: Dover. p. 37. ISBN 0-486-63946-0. Retrieved 24 April 2013. A partitioning as in Theorem 1.9.4 is called a conformable partition of A and B.
  13. ^ Anton, Howard (1994). Elementary Linear Algebra (7th ed.). New York: John Wiley. p. 36. ISBN 0-471-58742-7. ...provided the sizes of the submatrices of A and B are such that the indicated operations can be performed.
  14. ^ Mathai, Arakaparampil M.; Haubold, Hans J. (2017). Linear Algebra: a course for physicists and engineers. De Gruyter textbook. Berlin Boston: De Gruyter. p. 162. ISBN 978-3-11-056259-0.
  15. ^ Bernstein, Dennis (2005). Matrix Mathematics. Princeton University Press. p. 44. ISBN 0-691-11802-7.
  16. ^ a b c d e f g h Abadir, Karim M.; Magnus, Jan R. (2005). Matrix Algebra. Cambridge University Press. pp. 97, 100, 106, 111, 114, 118. ISBN 9781139443647.
  17. ^ "Is this formula for a matrix block inverse in terms of the entire matrix inverse known?". MathOverflow.
  18. ^ Escalante-B., Alberto N.; Wiskott, Laurenz (2016). "Improved graph-based SFA: Information preservation complements the slowness principle". Machine Learning. arXiv:1412.4679. doi:10.1007/s10994-016-5563-y.
  19. ^ Taboga, Marco (2021). "Determinant of a block matrix", Lectures on matrix algebra.
  20. ^ Silvester, J. R. (2000). "Determinants of Block Matrices" (PDF). Math. Gaz. 84 (501): 460–467. doi:10.2307/3620776. JSTOR 3620776. Archived from the original (PDF) on 2025-08-07. Retrieved 2025-08-07.
  21. ^ Sothanaphan, Nat (January 2017). "Determinants of block matrices with noncommuting blocks". Linear Algebra and Its Applications. 512: 202–218. arXiv:1805.06027. doi:10.1016/j.laa.2016.10.004. S2CID 119272194.
  22. ^ Quarteroni, Alfio; Sacco, Riccardo; Saleri, Fausto (2000). Numerical mathematics. Texts in applied mathematics. New York: Springer. pp. 10, 13. ISBN 978-0-387-98959-4.
  23. ^ a b c George, Raju K.; Ajayakumar, Abhijith (2024). "A Course in Linear Algebra". University Texts in the Mathematical Sciences: 35, 407. doi:10.1007/978-981-99-8680-4. ISBN 978-981-99-8679-8. ISSN 2731-9318.
  24. ^ Prince, Simon J. D. (2012). Computer vision: models, learning, and inference. New York: Cambridge university press. p. 531. ISBN 978-1-107-01179-3.
  25. ^ a b c d e Bernstein, Dennis S. (2009). Matrix mathematics: theory, facts, and formulas (2 ed.). Princeton, NJ: Princeton University Press. pp. 168, 298. ISBN 978-0-691-14039-1.
  26. ^ Dietl, Guido K. E. (2007). Linear estimation and detection in Krylov subspaces. Foundations in signal processing, communications and networking. Berlin ; New York: Springer. pp. 85, 87. ISBN 978-3-540-68478-7. OCLC 85898525.
  27. ^ Horn, Roger A.; Johnson, Charles R. (2017). Matrix analysis (Second edition, corrected reprint ed.). New York, NY: Cambridge University Press. p. 36. ISBN 978-0-521-83940-2.
  28. ^ Datta, Biswa Nath (2010). Numerical linear algebra and applications (2 ed.). Philadelphia, Pa: SIAM. p. 168. ISBN 978-0-89871-685-6.
  29. ^ a b Stewart, Gilbert W. (2001). Matrix algorithms. 2: Eigensystems. Philadelphia, Pa: Soc. for Industrial and Applied Mathematics. p. 5. ISBN 978-0-89871-503-3.
