  Re: ATTN: Math-Heads  
From: Kevin Wampler
Date: 15 Feb 2009 13:17:13
Message: <49985c29$1@news.povray.org>
John VanSickle wrote:
> Here's the sad part.  At the moment, to make ends meet, I work the night 
> shift at a local store.  I wrote the piece on a laptop during my lunch 
> break.  While I was writing I thought, "Does anyone who understands this 
> belong here?"

I'm sorry to hear that.  Hopefully working at the store is still 
enjoyable -- I know a few people with graduate degrees who've 
(voluntarily) taken time off from computers to work retail, and often 
they've had a decent time of it.  Best of luck in finding something more 
profitable quickly all the same.

I didn't notice any errors in your math, although I didn't read through 
in great detail, so I could have missed something.  I'm more used to 
seeing the method presented in pure matrix form as minimizing the L2 
norm of Ax-b, but your approach works well for the way you've formulated 
the problem.  Perhaps the title might be better phrased as "Least 
Squares Regression" rather than "Least Squares Method" to better 
highlight this difference?  Certainly a minor point at best though.
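[Editor's note: to make the matrix form mentioned above concrete, here is a minimal sketch using NumPy (assumed available) that fits a line y = m*x + c by minimizing the L2 norm of Ax - b; the sample data is invented for illustration.]

```python
import numpy as np

# Fit y = m*x + c to sample points by least squares.
# Each row of A is [x_i, 1]; b holds the observed y_i values.
x = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.0, 3.0, 5.0, 7.0])   # exactly y = 2x + 1
A = np.column_stack([x, np.ones_like(x)])

# Solve min ||A @ params - b||_2 without explicitly forming
# and inverting A^T A.
params, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
m, c = params
print(m, c)  # recovers slope 2 and intercept 1
```

Since the sample points lie exactly on a line, the residual is zero; with noisy data the same call returns the best-fit slope and intercept in the L2 sense.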

The main thing I can think of which you might (or might not) want to 
mention is that it's often not a bad idea to use a third-party solver, 
rather than constructing and inverting A^T*A yourself, and that 
depending on the application a useful solution can still sometimes be 
obtained even when A^T*A is singular (its determinant is zero, so it 
has no inverse) -- the solution just isn't unique in that case.
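[Editor's note: a minimal sketch of the singular case, again assuming NumPy. The two columns of A are deliberately identical, so A^T*A has determinant zero and cannot be inverted, yet an SVD-based solver still returns a valid (minimum-norm) solution.]

```python
import numpy as np

# Rank-deficient system: the columns of A are identical, so
# A^T A is singular and the least-squares solution is not unique.
A = np.array([[1.0, 1.0],
              [2.0, 2.0],
              [3.0, 3.0]])
b = np.array([2.0, 4.0, 6.0])

# Inverting A^T A would fail here, but lstsq (backed by an SVD)
# returns the minimum-norm solution among the valid ones.
x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
print(rank)  # 1 -- rank-deficient
print(x)     # [1. 1.], the minimum-norm choice among x1 + x2 = 2
```

Any x with x1 + x2 = 2 solves this system exactly; the solver picks the one of smallest norm, which is often what an application wants.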

Overall I liked it, and thought it was a good, concise, and 
straightforward description.


