
PSDP Problem

Definitions

PSDP:  \min_{P \in S_n^+} \|F - PG\|_F
SP:    \min_{S \in S_n} \|F - SG\|_F

Here, F, G \in \mathbb{R}^{n \times m}, S_n denotes the set of symmetric n \times n matrices, S_n^+ the symmetric positive semidefinite ones, and \|\cdot\|_F is the Frobenius norm.

Background

  • \tilde{S} denotes a minimizer of the SP problem.
  • \tilde{P} denotes a minimizer of the PSDP problem.
  • \tilde{S} satisfies \tilde{S}GG^T + GG^T\tilde{S} = FG^T + GF^T = Q.
  • When rank(G) = n, a sufficient condition for \tilde{S} to be in S_n^+ is that Q \in S_n^+.
  • A sufficient condition for \tilde{S} to be in S_n^+ is that F^T G \in S_m^+.
  • rank(G) = n is also necessary and sufficient for uniqueness of the minimizer in the case of PSDP.
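The Lyapunov-type characterization of \tilde{S} above can be checked numerically. A minimal sketch, assuming SciPy is available (the random data is illustrative; with rank(G) = n the equation has a unique solution):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

rng = np.random.default_rng(0)
n, m = 4, 6
F = rng.standard_normal((n, m))
G = rng.standard_normal((n, m))   # full row rank with probability 1

# The symmetric minimizer solves  S (GG^T) + (GG^T) S = FG^T + GF^T = Q,
# which is a Lyapunov equation  A S + S A^H = Q  with A = GG^T.
A = G @ G.T
Q = F @ G.T + G @ F.T
S = solve_continuous_lyapunov(A, Q)
```

Since A = GG^T is positive definite here, \lambda_i(A) + \lambda_j(A) > 0 for all i, j, so the Lyapunov equation is uniquely solvable.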

Previous Approaches

  • In the case of PSDP, the objective function and the feasible set S_n^+ are both convex.
  • One can therefore use convex-optimization methods that directly exploit the convexity of S_n^+.
  • Such methods converge to the global minimum: after finitely many iterations the approximate minimizer can be brought arbitrarily close to the unique global minimizer.
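One standard convex method of this kind is projected gradient descent on the PSD cone. A minimal sketch (the function names and step-size choice are my own, not from the paper):

```python
import numpy as np

def proj_psd(X):
    # Project onto the PSD cone: symmetrize, then clip
    # the negative eigenvalues to zero.
    Xs = (X + X.T) / 2
    w, V = np.linalg.eigh(Xs)
    return (V * np.clip(w, 0, None)) @ V.T

def psdp_projected_gradient(F, G, steps=5000, lr=None):
    # Minimize 0.5 * ||F - P G||_F^2 over PSD P by projected gradient
    # descent; the gradient w.r.t. P is (P G - F) G^T.
    n = F.shape[0]
    if lr is None:
        lr = 1.0 / np.linalg.norm(G @ G.T, 2)  # 1 / Lipschitz constant
    P = np.zeros((n, n))
    for _ in range(steps):
        grad = (P @ G - F) @ G.T
        P = proj_psd(P - lr * grad)
    return P
```

The projection step is an eigendecomposition followed by eigenvalue clipping, which is exactly where the convexity of S_n^+ is exploited.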

Approach of this paper

PSDP: \min_{E \in \mathbb{R}^{n \times n}} \|F - E^T E G\|_F
\tilde{P}_{\mathrm{global}} = \tilde{E}_{\mathrm{local}}^T \tilde{E}_{\mathrm{local}}

Substituting P = E^T E removes the PSD constraint: a global minimizer of PSDP is recovered from any local minimizer \tilde{E} of the unconstrained reformulation.

Algorithm ideas:

  • Steepest descent (provably fails to converge for this problem).
  • Newton's algorithm (modified; superior to the convex programs known at the time).
  • Where do we obtain the data? Take exact pairs (F, G) and perturb them by errors to make sure that \tilde{S} \neq \tilde{P}.
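A rough sketch of the reformulation, using SciPy's general-purpose BFGS minimizer in place of the paper's modified Newton method (the random data is illustrative only):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n, m = 3, 5
F = rng.standard_normal((n, m))
G = rng.standard_normal((n, m))

def f(e):
    # f(E) = 0.5 * ||F - E^T E G||_F^2, with E reshaped from the flat vector e
    E = e.reshape(n, n)
    R = F - E.T @ E @ G
    return 0.5 * np.sum(R * R)

def grad(e):
    # gradient w.r.t. E:  -E (R G^T + G R^T)  with  R = F - E^T E G
    E = e.reshape(n, n)
    R = F - E.T @ E @ G
    return (-E @ (R @ G.T + G @ R.T)).ravel()

res = minimize(f, rng.standard_normal(n * n), jac=grad, method="BFGS")
E_loc = res.x.reshape(n, n)
P = E_loc.T @ E_loc   # PSD by construction: candidate PSDP minimizer
```

Note that P = E^T E is positive semidefinite no matter what E the optimizer returns, which is the whole point of the substitution.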

Newton’s method:

x_{k+1} = x_k - f(x_k)/f'(x_k)
  • Has quadratic convergence (near a simple root).
  • Can find complex zeros.
  • For n variables, O(n^2) operations are needed to form the Hessian \nabla^2 f and O(n^3) operations to find the inverse of the Hessian.
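A minimal illustration of the scalar iteration (my own toy example, not from the paper):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    # Newton's method for a root of f: x_{k+1} = x_k - f(x_k)/f'(x_k).
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Quadratic convergence: the number of correct digits roughly
# doubles each iteration near a simple root.
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
```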

Quasi-Newton method:

GD:    x_{k+1} = x_k - \eta I \nabla f(x_k)
NM:    x_{k+1} = x_k - \eta [\nabla^2 f(x_k)]^{-1} \nabla f(x_k)
QNM:   x_{k+1} = x_k - \eta H_k \nabla f(x_k)

A good choice for H_k is given by the BFGS method: H_k holds curvature information, and the key idea is to match the curvature observed along the trajectory. With s_k = x_{k+1} - x_k, y_k = \nabla f(x_{k+1}) - \nabla f(x_k), and \rho_k = 1/(y_k^T s_k), the BFGS update of the inverse-Hessian approximation is

H_{k+1} = (I - \rho_k s_k y_k^T) H_k (I - \rho_k y_k s_k^T) + \rho_k s_k s_k^T
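A sketch of this update on a strongly convex quadratic (my own toy setup; the exact line search below is closed-form because f is quadratic):

```python
import numpy as np

def bfgs_quadratic(A, b, x0, iters=50):
    # Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    # with BFGS; grad f(x) = A x - b, and the true inverse Hessian is A^{-1}.
    n = len(b)
    H = np.eye(n)          # initial inverse-Hessian approximation
    x = x0.copy()
    g = A @ x - b
    for _ in range(iters):
        if np.linalg.norm(g) < 1e-10:
            break
        p = -H @ g                         # quasi-Newton direction
        t = -(g @ p) / (p @ A @ p)         # exact line search (quadratic f)
        s = t * p                          # s_k = x_{k+1} - x_k
        x = x + s
        g_new = A @ x - b
        y = g_new - g                      # y_k = change in gradient
        rho = 1.0 / (y @ s)
        I = np.eye(n)
        # BFGS inverse-Hessian update; enforces the secant condition H y = s
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)
        g = g_new
    return x, H
```

In exact arithmetic, BFGS with exact line searches terminates on a quadratic in at most n steps, and by then H_k has absorbed the full curvature: H_n = A^{-1}.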