Friday, June 12, 2009

To Err is Human

The day was filled with Errors, of the human and astronomical varieties.

Yesterday:

I went through my multi-plots some more, this time with a magnitude = 24 cutoff (keeping only objects brighter than 24th magnitude). By looking at these brighter objects, we gain two major things:
-less uncertainty in measured magnitudes
-probably a more accurate classification (star vs gal)
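For the record, the cut itself is trivial; here's a sketch in Python rather than my actual IDL, with made-up column values and an invented star/galaxy classifier column:

```python
import numpy as np

# Hypothetical catalog columns (values and names are illustrative, not my real data)
b_mag = np.array([21.5, 23.9, 25.2, 19.8, 24.7])
obj_class = np.array([0.98, 0.12, 0.55, 0.03, 0.90])  # ~1 means "star"

# Keep only objects brighter than 24th magnitude
bright = b_mag < 24.0
print(b_mag[bright])

# Likely stars among the bright sample
stars = bright & (obj_class > 0.9)
print(b_mag[stars])
```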

We also then make the assumption that "stars are stars are stars", i.e. that the dimmer stars are fundamentally and structurally of the same nature as the bright ones. Thus, by investigating the patterns of star sequences in brighter samples, we can glean information about the dimmer ones (which may be farther away or, in some cases, cooler). And, as Beth pointed out (and as I surmised as well), this assumption would be ill-advised in the realm of galaxies: we know the dimmer ones (farther away or not) don't all have the same structures as those closer to us. This is due to the highly varied morphological properties of galaxies, their higher redshifts, and the phenomenon of "looking back in time" at younger galaxies, as opposed to our closer, more evolved galactic structures.

Once I implemented the magnitude cutoff, I made my multi-plots, showing different layers of brighter objects and color-coded stars vs. galaxies. **I did successfully show the separation of the brightest stars in a pretty sequence. Have yet to attempt this with an i-z color-color plot.**

And by the way, this was all on the larger set of processors, squid! Which is actually why this took me a little longer than it should have, because I was working in a less familiar environment. But I'm pretty acclimated to it now, even though I did prefer eel's IDL development environment. The switch became necessary because poor little eel and his one processor could no longer handle the loads I was giving him with the multiple-plot procedures.

Today:

Began the morning by going through and making up a few quick multi-plots (in the same manner as yesterday), corresponding to the southern field, as I worked only with the northern half prior.

Then came the Reign of Error! ;P
I had started a code yesterday evening, setting up to make my plots of magnitude and color uncertainties, in the style I found as I read Dylan's research. This plotting got off to a hazardous start, and only got worse.

First tackled was a simple plot of B measurement error vs. B magnitude. This looked more or less as expected, once axis parameters were implemented. So I moved on to a B-V error vs. B magnitude plot, and got a not-so-nice surprise: numerous points appeared below the curve of the estimated minimum errors, in streaks toward the x-axis. Beth and I puzzled over these odd points for a while, doing several sanity checks on my data and making sure my code was indeed debugged, and it seemed to me that the only way they could have appeared was if the B error was somehow contributing nothing at all. Then it dawned on me that some of the B errors were showing up as 0's!
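In case the math helps: assuming the B-V uncertainty is just the B and V errors added in quadrature (the usual rule for independent errors), a zero B error drags the combined uncertainty down toward the x-axis. A quick sketch in Python rather than IDL, with made-up numbers:

```python
import numpy as np

# Invented per-band uncertainties; note the suspicious zeros in B
b_err = np.array([0.02, 0.0, 0.05, 0.0])
v_err = np.array([0.03, 0.01, 0.04, 0.0])

# Color uncertainty: independent errors add in quadrature
bv_err = np.sqrt(b_err**2 + v_err**2)
print(bv_err)

# Flag the points whose B error came through as exactly zero
suspect = np.where(b_err == 0.0)[0]
print(suspect)
```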

Investigating a small portion of these points, I found that most of them were classified as galaxies (like 90% of those between 0-0.04). I also went on to examine them in the largest aperture, to see how the measurements and errors compared. Using my new favorite IDL toy, the multi-plot, I set up a comparison between the two uncertainty plots in each aperture, but have been able to determine little from them other than that this odd phenomenon occurs in the same manner in both.

Since I needed further direction for the afternoon, before Beth left she also started me thinking about and working on taking a median of the plots. Using mostly the online tutorial from Astro333, and a little from an IDL book, I got the code written out to my satisfaction and ran it with high hopes. These were soon torn down, because the code couldn't even get as far as the for loop. Debugging proceeded for hours (yes, multiple hours *exhaustion*) as, bit by bit, each line was altered. And sanity-checked. And thoroughly examined. Repeatedly (thanks to Gail for helping me through this frustrating and grueling process). One error message at a time was sifted through.

Finally, we got to the bottom of the problem, and I solved the for loop's issues by adding some limits on the range of the data set. This stemmed from the realization that some of my data are funky: very dim, carrying crazy filler readings of 99's, or there are simply holes where there are no data points at some magnitudes on the brighter end of the spectrum.
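A rough sketch of the cleaned-up binned-median idea (in Python rather than my actual IDL, with invented numbers, bin edges, and cut limits):

```python
import numpy as np

# Made-up magnitudes and errors, including the 99 filler readings
mag = np.array([18.2, 99.0, 20.1, 20.4, 22.3, 22.9, 99.0, 23.5])
err = np.array([0.01, 99.0, 0.02, 0.03, 0.06, 0.08, 99.0, 0.12])

# Throw out the 99 fillers and anything outside a sane magnitude range
good = (mag < 90.0) & (err < 90.0) & (mag > 15.0) & (mag < 26.0)
mag, err = mag[good], err[good]

# Median error in each magnitude bin, skipping empty bins
edges = np.arange(15.0, 26.5, 2.0)
med_mag, med_err = [], []
for lo, hi in zip(edges[:-1], edges[1:]):
    in_bin = (mag >= lo) & (mag < hi)
    if in_bin.any():  # holes on the bright end show up as empty bins
        med_mag.append(0.5 * (lo + hi))
        med_err.append(np.median(err[in_bin]))

# Convert to arrays at the end, so the plotting routine gets what it expects
med_mag, med_err = np.array(med_mag), np.array(med_err)
print(med_mag, med_err)
```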

Then, luckily, without another error, it gave me the median values I sought. And then, to my dismay, when I tried to oplot them, it failed, claiming that my variable wasn't an array. Which I have yet to figure out, because I told it that it was an array, and I can't figure out why it doesn't know that... But at this point I'm going to need assistance from a fresh pair of eyes, because I've been working on this code for too long for anything wrong to pop out at me.

So, for Monday:

-Finish debugging this code and get a pretty median uncertainty line.
-Meet with Beth about my work and set official goals for where I'm going with my project for the rest of the summer.

Also on my to-do list:
-RANDOMU!!
-Error plots for other bands and colors
-more reading

2 comments:

  1. Debugging is good for the soul. I'm glad you had a rough debugging session... even though I'm sure it was torturous. Let's look at your code together first thing Monday morning. If you want more instant gratification, copy and paste your code into an email to me, along with the error it crashes on, and I can help remotely!

  2. p.s. You guys are too funny in your blogs.
