Figured I had to post this... I've been trying to compile command-line vim 7.3 on Mac OS X 10.7. I have the latest `hg clone`'d version of vim, and I'm stuck on ncurses. If I `./configure` with no options, I get the following error::

    checking --with-tlib argument... empty: automatic terminal library selection
    checking for tgetent in -ltinfo... no
    checking for tgetent in -lncurses... no
    checking for tgetent in -ltermlib... no
    checking for tgetent in -ltermcap... no
    checking for tgetent in -lcurses... no
    no terminal library found
    checking for tgetent()... configure: error: NOT FOUND!
          You need to install a terminal library; for example ncurses.
          Or specify the name of the library with --with-tlib.

If instead I try `./configure --with-tlib=ncurses`::

    checking --with-tlib argument... ncurses
    checking for linking with ncurses library... configure: error: FAILED

I have Xcode 4.1. As far as I can tell, ncurses is available::

    $ file /usr/lib/libncurses.*
    /usr/lib/libncurses.5.4.dylib: Mach-O universal binary with 2 architectures
    /usr/lib/libncurses.5.4.dylib (for architecture x86_64): Mach-O 64-bit dynamically linked shared library x86_64
    /usr/lib/libncurses.5.4.dylib (for architecture i386): Mach-O dynamically linked shared library i386
    /usr/lib/libncurses.5.dylib: Mach-O dynamically linked shared library i386
    /usr/lib/libncurses.dylib: Mach-O universal binary with 2 architectures
    /usr/lib/libncurses.dylib (for architecture x86_64): Mach-O 64-bit dynamically linked shared library x86_64
    /usr/lib/libncurses.dylib (for architecture i386): Mach-O dynamically linked shared library i386

The fix: I changed my PATH from /usr/local/bin... to /usr/bin..., because configure was trying to use my /usr/local/bin/gcc instead of the Mac default /usr/bin/gcc. Something about my locally installed gcc (4.6.1) caused major problems. I also eventually had to run configure like this::

    LDFLAGS=-L/usr/lib CFLAGS='-arch i386 -arch x86_64' CCFLAGS='-arch i386 -arch x86_64' \
        CXXFLAGS='-arch i386 -arch x86_64' ./configure --enable-perlinterp \
        --enable-pythoninterp --enable-cscope --with-features=huge

and then had to make sure my default python was NOT pointing to Enthought!
Comps 2 reflections
Comps 2 included some successes and some failures. The most successful part of my Comps preparation was the Monday talk. The previous talk on Tuesday was somewhat helpful in terms of realizing that I needed larger figure axes, but otherwise provided no useful feedback. The Monday talk allowed me to realize what needed to be done to make my talk accessible to a larger audience. At the defense, I ended up going only ~40 minutes despite having gone far over time in the Monday version and spending ~5 minutes answering questions from Don and Mike. I think that was a good thing; I didn't need to say anything more, even though there was an enormous amount of additional material I could have covered. The main change I made from Monday to Friday was reorganizing so that I discussed the largest scales first and zoomed in, and I spent much more time discussing the larger context of my work. Unfortunately, I also spent most of the week before the presentation determining that larger context and reading papers. Ideally, I would have done that before handing in the paper.

The closed-door Q&A section went OK but not great. There were a few important bits of information related to the IMF that I didn't know off the top of my head - e.g. the ratio of the total number of stars to the number of B stars. I got the lowest mass star (0.07 solar masses) confused with the most common star mass (0.3). I wasn't really able to integrate the IMF on the board either. I didn't remember the Jeans mass-temperature and mass-density relationships, but was able to derive them quickly enough. Probably the biggest problem was dealing with a question about the partition function - specifically, how the partition function comes into play in the column density equation. I didn't come up with the right answer at all, and in particular quoted the wrong distribution. However, I think a big part of what they expected to hear was a dependence on temperature AND degeneracy, and I never explicitly mentioned degeneracy. It turned out that the equation I had quoted in both the paper and the talk was correct, but I couldn't come anywhere close to proving that on the spot.

My expected result is therefore a low pass, though that was not made explicit. That's rather unfortunate, as it's possible that another month of preparation could have gotten me the high pass, but at the same time, it's well worth having the project done.
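For reference, these are the standard textbook forms of the two relations that came up (not necessarily the exact expressions the committee had in mind). The Jeans mass scales with temperature and density as

.. math::

    M_J \simeq \left(\frac{5 k T}{G \mu m_{\rm H}}\right)^{3/2}
               \left(\frac{3}{4 \pi \rho}\right)^{1/2}
    \propto T^{3/2} \, \rho^{-1/2},

and the partition function enters the column density through the usual relation between the total column and the column in a single upper state,

.. math::

    N_{\rm tot} = \frac{N_u}{g_u} \, Q(T_{\rm ex}) \, e^{E_u / k T_{\rm ex}},

which is where both the temperature (through :math:`Q(T_{\rm ex})` and the Boltzmann factor) and the degeneracy :math:`g_u` show up.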
Connecting to ipython notebook with SSH tunneling
My typical ssh tunnel looks something like::

    ssh -N -f -L 8889:SERVER.colorado.edu:8889 ginsbura@SERVER.colorado.edu

For ipython notebooks, this approach was giving me the error ``channel 2: open failed: connect failed: Connection refused``. The reason is that the ipython notebook is served at http://127.0.0.1:8888/ locally on the remote machine, i.e. it only listens on localhost, so the tunnel has to point at localhost on the remote end (and at the right port). The correct ssh tunnel command is therefore::

    ssh -N -f -L localhost:8888:localhost:8888 adam@SERVER.colorado.edu
Converting a CLASS-created .fits file to a real (FITS-compliant) FITS file
This post is to remind me, the next time I go looking, how the hell to convert from a GILDAS CLASS fits spectrum (created by ``fits write blah.fits /mode spectrum``) to a FITS-compliant spectrum.

First, remember the FITS-WCS spectral definitions: http://www.aanda.org/index.php?option=com_article&access=bibcode&Itemid=129&bibcode=2006A%2526A...446..747GFUL

And the peculiar CLASS definitions: http://iram.fr/IRAMFR/GILDAS/doc/html/class-html/node84.html

Key points: CLASS stores the CDELT parameter as DELTAV, in m/s instead of km/s, and the velocity offset of the spectral frame in VELO-LSR, also in m/s.

Things to set:

- CTYPE = VRAD
- SPECSYS = SOURCE
- SSYSSRC = LSRK
- VELOSYS = frame velocity (VELO-LSR or CRVAL1)

This information is subject to change...
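Here's a minimal sketch of that header surgery in Python, assuming ``astropy.io.fits`` is available, that the spectral axis is axis 1, and that the CLASS file actually carries the DELTAV and VELO-LSR keywords described above (the filenames are just placeholders):

.. code-block:: python

    # Hedged sketch: patch a CLASS-written spectrum header toward FITS-WCS compliance.
    from astropy.io import fits

    hdulist = fits.open('blah.fits')      # file written by CLASS 'fits write'
    hdr = hdulist[0].header

    hdr['CTYPE1'] = 'VRAD'                # radio velocity axis
    hdr['CUNIT1'] = 'm/s'                 # CLASS DELTAV / VELO-LSR are in m/s
    hdr['CDELT1'] = hdr['DELTAV']         # channel width as stored by CLASS
    hdr['SPECSYS'] = 'SOURCE'
    hdr['SSYSSRC'] = 'LSRK'
    hdr['VELOSYS'] = hdr['VELO-LSR']      # frame velocity (or use CRVAL1)

    hdulist.writeto('blah_fixed.fits', overwrite=True)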
Converting GILDAS-CLASS data cubes (lmv files) to fits
As usual, CLASS documentation is nearly impossible to navigate. At the end of the CLASS "introduction" (gildas-intro.pdf) there is a subtle and obscure reference to the ``vector\fits`` command. The conversion is actually relatively straightforward::

    vector\fits outfile.fits from infile.lmv

AG
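As a quick sanity check on the exported cube (a sketch only; it assumes ``astropy`` is installed and that ``outfile.fits`` is the file written by the command above):

.. code-block:: python

    # Sketch: open the exported cube and print the axis types/units GILDAS wrote.
    from astropy.io import fits

    cube = fits.open('outfile.fits')[0]
    print(cube.data.shape)    # typically (nchan, ny, nx) for a spectral cube
    for i in (1, 2, 3):
        print(cube.header.get('CTYPE%i' % i), cube.header.get('CUNIT%i' % i))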
cython vs f2py
I had a go at optimizing some code this past week, and ended up learning to use both cython and f2py.

f2py is much easier to use: if you want to write a function in fortran and use it in python, all you do is write the fortran code and add interface specifications as comments in that code.

cython is more natural to code in. The code style is C/fortran-like - think in terms of loops instead of arrays - but the syntax is python-like, which makes the coding somewhat clearer and simpler. For my code, I found that cython was ~10% slower than fortran.

Check out the various plfit implementations in: http://code.google.com/p/agpy/source/browse/#svn/trunk/
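For context, here is a rough pure-NumPy sketch of the kind of hot loop involved - a Clauset-et-al.-style power-law fit, roughly what plfit does: scan over candidate ``xmin`` values, compute the maximum-likelihood index for each, and keep the one with the smallest KS distance. This is illustrative only, not the agpy code itself; it's exactly this per-candidate looping that the Cython and f2py versions speed up.

.. code-block:: python

    # Illustrative sketch (not the agpy code): brute-force power-law fit
    # in the style of Clauset et al. -- the kind of loop worth porting
    # to Cython or f2py.
    import numpy as np

    def plfit_sketch(data):
        data = np.sort(np.asarray(data, dtype=float))
        n = data.size
        best_ks, best_xmin, best_alpha = np.inf, None, None
        for i in range(n - 2):                       # keep at least 3 points in the tail
            xmin = data[i]
            tail = data[i:]
            logs = np.log(tail / xmin)
            if logs.sum() == 0:                      # degenerate tail (all values equal)
                continue
            alpha = 1.0 + tail.size / logs.sum()     # maximum-likelihood power-law index
            cdf_model = 1.0 - (tail / xmin) ** (1.0 - alpha)
            cdf_data = np.arange(tail.size, dtype=float) / tail.size
            ks = np.abs(cdf_model - cdf_data).max()  # KS distance for this xmin
            if ks < best_ks:
                best_ks, best_xmin, best_alpha = ks, xmin, alpha
        return best_xmin, best_alpha, best_ks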
Danger Zone
The Danger Zone: Where the Buffalo Roam.
Defining a Molecular Cloud
I'm going to try to answer a few questions [this post is in-progress and will be updated]:
- Why are molecular clouds called molecular clouds?
- What distinguishes a molecular cloud from similar objects (e.g. cores, clumps, HII regions)?
- What do molecular clouds look like?
- How do astronomers look at molecular clouds?
- Why do astronomers look at molecular clouds (why are they interesting)?
- What do molecular clouds have to do with star formation and planet formation?
- How are molecular clouds involved in galactic evolution?
1. Molecular clouds are regions in space with densities high enough and temperatures low enough that molecules can form. The most common molecule is molecular hydrogen (H\ :sub:`2`), simply because hydrogen is the most common atom. The next most common is carbon monoxide (CO), which is ~10\ :sup:`4` times less common. CO is the easiest to observe, though, because H\ :sub:`2` can only emit light and energy in 'forbidden' transitions that don't happen very often.

2. Size, temperature, and composition all distinguish molecular clouds from other nebulae.
An H II region is very hot - in it, hydrogen atoms are ionized by radiation from a nearby O-type star. In regions hot enough to ionize hydrogen, all molecules are destroyed unless they have already collected into larger dust particles.
A core is a much denser bunch of gas than a molecular cloud. Cores are dense enough that their outsides shield their insides from the radiation of the rest of the universe and their insides cool to very low temperatures. At low temperatures, the pressure supporting the cloud against collapse is lower, and stars can form via gravitational collapse.
I don't really know what clumps are... they're supposed to be something intermediate between cores and clouds, but I don't know what their distinguishing features are.
3. It depends on what wavelength you're looking in. In the optical, where our eyes can see, molecular clouds look dark - they're seen because they absorb light from behind them. In the infrared, the hot ones glow, but the cold ones are still invisible. If you go all the way to the millimeter, all molecular clouds glow, no matter how cold, but in order to see them there has to be a lot of material. I'll add some pictures here later.

4. Lots of ways. Optical telescopes aren't the best choice, though. With near-infrared images, we can detect molecular clouds by counting stars and noticing when there aren't as many in some regions. This technique is called "NICE" and works because there are so many stars in the sky and the dust in molecular clouds blocks (and reddens) some, but not all, of their light. In the far infrared, we can see hot dust glowing, but this is very difficult because our atmosphere is opaque in the infrared - it's like trying to look through a brick wall. We need telescopes in space to be able to see anything at these wavelengths. In the millimeter, there are some 'holes' in the atmospheric absorption, sort of like color filters, that we can see through to detect the coldest emission.
Delirium Nocturne
Delirium Tremens is a tasty beer itself, I think. I'm drinking Delirium Nocturne right now. It has a cool (opaque!) bottle, plus a pink elephant on the label. At 8.5% it's pretty strong, but very drinkable with no taste of alcohol. It's definitely my favorite Belgian brown now. It still has that wheat-soy-sauce taste to it, but very weakly. It poured with ridiculous head - I could not control it. Overall, great beer, worth the ridiculous price I paid for it, whatever that was.
Detexify!
for when you can't remember how to draw an angstrom: detexify

.. image:: http://2.bp.blogspot.com/_lsgW26mWZnU/SydQdSeP5EI/AAAAAAAAFds/f8-29Qvef3E/s400/detexify.png