## 01.31.2006 15:19

### xstar nav file decimators

Here are two little Python scripts to help out those who have to deal with decimating navigation data from the SIO XStar system. They will not work close to the prime meridian because of a bug in the processing software or the chirp system; that is what the lines that look for +/- .1 degrees are for.

To use these commands, just pass in a list of files, e.g. "./filter.py foo.txt bar.txt sos.tex justshootmenow.txt"

The first script sends the results to stdout (a.k.a. the screen or console).
#!/usr/bin/env python
import sys
for file in sys.argv[1:]:
    print file
    linenum = 0
    for line in open(file):
        fields = line.split()
        if not fields:
            continue    # skip blank lines
        if fields[0] == 'lat':
            print line, # keep the header
            continue    # but do NOT count it
        if float(fields[0]) < .1 and float(fields[0]) > -.1:
            continue    # skip nasty bad nav points from the xstar
        if 0 == linenum % 500:
            print line,
        linenum += 1

The next one is a little trickier. It dumps the decimated data to a new file with ".500" appended to the original file name.
#!/usr/bin/env python
import sys
for file in sys.argv[1:]:
    print file
    linenum = 0
    out = open(file + '.500', 'w')
    for line in open(file):
        fields = line.split()
        if not fields:
            continue    # skip blank lines
        if fields[0] == 'lat':
            if linenum == 0: out.write(line) # keep the header
            continue    # but do NOT count it
        if float(fields[0]) < .1 and float(fields[0]) > -.1:
            continue    # skip nasty bad nav points from the xstar
        if 0 == linenum % 500:
            out.write(line)
        linenum += 1
    out.close()

The second script also drops any repeated header lines (describing which field is which) that show up later in files generated by the lsd command.

Enjoy!

## 01.31.2006 11:07

### ucsd latex thesis

I just worked out how to control the citations in my thesis. I first read Getting to Grips with LaTeX - Bibliography Management by Andy Roberts. Here is what I did. The beginning of my thesis document looks like this:
  \documentclass[12pt]{report}
\usepackage{natbib}
\bibpunct{(}{)}{;}{a}{,}{,}
\usepackage{ucsd-thesis}
\usepackage{graphicx}
\begin{document}

I just added the natbib and bibpunct commands. Then I modified the end of the document to look like this:
  a \nocite{schwehr2003}
b \cite{schwehr2003}
c \citet{schwehr2003}
d \citep{schwehr2003}
e \citet*{schwehr2003}
f \citep*{schwehr2003}
%\bibliographystyle{plain}  % this was numbered, which I do not want.
\bibliographystyle{plainnat}
\bibliography{schwehr}
\end{document}

The results come out something like this:
  a
b Schwehr and Tauxe (2003)
c Schwehr and Tauxe (2003)
d (Schwehr and Tauxe, 2003)
e Schwehr and Tauxe (2003)
f (Schwehr and Tauxe, 2003)
Bibliography
K.D. Schwehr and L. Tauxe. Characterization of soft sediment deformation:
detection of crypto-slumps using magnetic methods. Geology, 31(3):203-206,
2003.

BTW, the bibtex entry for schwehr2003 is:
@article{
schwehr2003,
Author = {Schwehr, K.D. and Tauxe, L.},
Title = {Characterization of soft sediment deformation: detection of crypto-slumps using magnetic methods},
Journal = {Geology},
Volume = {31},
Number = {3},
Pages = {203-206},
Year = {2003} }


## 01.31.2006 09:27

### messing with gsf

I took a quick look at the Generic Sensor Format (gsf) this morning and got it to compile. Here is what I changed in gsf.h:
#ifndef __APPLE__
#if (!defined (_STRUCT_TIMESPEC_) && !defined (_TIMESPEC_T) && !defined (_STRUCT_TIMESPEC) && !defined (_SYS_TIMESPEC_H) && !defined (__timespec_defined))
#define _STRUCT_TIMESPEC_
#define _TIMESPEC_T
#define _STRUCT_TIMESPEC
#define _SYS_TIMESPEC_H
#define __timespec_defined
struct timespec
{
time_t          tv_sec;
long            tv_nsec;
};
#endif
#endif /* __APPLE__ */

Val brought up that there is a command to see everything that the C preprocessor (cpp) defines:
cpp -dM gsf.h | egrep -i 'APPLE|DARWIN'
#define __APPLE_API_STANDARD
#define __SYS_APPLEAPIOPTS_H__
#define __APPLE_API_OBSOLETE
#define __APPLE__ 1
#define __APPLE_API_STABLE
#define __APPLE_API_EVOLVING
#define __APPLE_API_UNSTABLE
#define __APPLE_API_PRIVATE
#define __VERSION__ "3.3 20030304 (Apple Computer, Inc. build 1495)"


## 01.31.2006 08:33

### input managers on osx - a security nightmare?

From: http://daringfireball.net/2006/01/smart_crash_reports
  As stated before, every installed input manager loads into (nearly)
every application. Input managers that are targeting one specific
application, such as the way Saft and PithHelmet patch Safari or the
way Smart Crash Reports patches Crash Reporter, typically perform
some identifier checking so as only to deliver their actual payload
inside the application they're targeting. But, no bones about it,
the nature of input managers is such that they're loaded into every
app on your system. The basic gist is that when they're loaded, they
check to see whether this app is the app they're looking to patch,
and if it isn't, do nothing more. Unsanity has gone so far as to
post a FAQ with example Objective-C code showing their technique:

When I read this, my first thought was "oh crap". Is this going to be the security nightmare of the Mac world? What an easy way to inject spyware.

## 01.30.2006 13:23

### Seismic Data Set Directory

This just came across the Seismic Unix mailing list. SPOT (at mtu.edu) has a Seismic Data Set Directory. Some of this stuff costs money and some of the links are broken, but it looks like the start of a good resource.

## 01.26.2006 21:07

### schematic of a chirp pulse

I told Becca that I would show her how much easier it was to do a chirp pulse in python as compared to C/C++, so here it is. This is just a schematic, so do not take this as implying that this is exactly the pulse shape for the XStar.
#!/usr/bin/env python
from math import *
samples = 5000
for i in range(samples):
    x = i/100.
    y = sin(x * (1 + .7*x))
    # cosine taper
    y = y * (.5 - .5 * cos(i * 2 * pi / float(samples)))
    print x, y
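Why this counts as a chirp: the argument of the sine above is x + 0.7x&#178;, so the instantaneous frequency (the derivative of the phase over 2&#960;) grows linearly with x. A quick sketch to check this (my addition, not part of the script above):

```python
from math import pi

# phase(x) = x + 0.7*x**2, so d(phase)/dx = 1 + 1.4*x radians per unit x
def inst_freq(x):
    return (1 + 1.4 * x) / (2 * pi)  # cycles per unit x

for x in (0.0, 10.0, 25.0, 50.0):
    print('x = %5.1f  f = %7.3f' % (x, inst_freq(x)))
```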

I then ran the program like this to see what it produces:
./taper.py > taper.dat
gnuplot
gnuplot> plot 'taper.dat' with l


This produces a tapered pulse that looks like this:

An SVG version is also available, if your browser supports SVG and assuming that gnuplot puts out valid SVG that works on the web.

NOTE: See Gutowski et al. 2005 for more details on CHIRP.

This example uses the Hann window.

## 01.14.2006 21:38

### median destructive field

I just finished writing a little python program to calculate the median destructive field (MDF) for my cores. This should give me a sense of the coercivity of the magnetic carrier minerals down core. Clearly they are pretty different at the top of the core. The program is called mdf.py. Right now, it only works off of sqlite databases laid out in the same way as my thesis.

The purple line is the maximum applied afdemag field in mT. Here is a bit of the output from mdf.py:
./mdf.py -n 4
# Median Destructive Field (MDF)
# samplename, mdf, depth, numberoftreats(n), minInt, maxInt, minTreat, maxTreat
bp04-4gw-s1-006 35.2 6.0 12 2.039e-06 0.0001741 0.0 100.0
bp04-4gw-s1-009 34.3 9.0 12 1.695e-06 0.0002063 0.0 100.0
bp04-4gw-s1-012 36.9 12.0 13 3.671e-06 0.0002324 0.0 100.0
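mdf.py itself isn't shown here, but the core of an MDF calculation can be sketched like this: walk down the AF demag curve and interpolate to the field at which the remanence has fallen to half its initial intensity. This is a hypothetical, stripped-down version, not the actual mdf.py, and the demag curve below is made-up sample data, not from the cores:

```python
def mdf(treats, ints):
    # Median destructive field: the AF level (mT) at which the remanence
    # has fallen to half of its initial (NRM) intensity.
    half = ints[0] / 2.0
    for i in range(1, len(ints)):
        if ints[i] <= half:
            # linear interpolation between the two bracketing demag steps
            t0, t1 = treats[i-1], treats[i]
            y0, y1 = ints[i-1], ints[i]
            return t0 + (half - y0) * (t1 - t0) / (y1 - y0)
    return None  # never demagnetized below half intensity

# made-up demag curve: AF fields in mT, intensities in arbitrary units
fields = [0., 5., 10., 15., 20., 30., 40., 60., 80., 100.]
intens = [1.0, .95, .85, .72, .60, .45, .33, .18, .09, .04]
print('MDF = %.1f mT' % mdf(fields, intens))
```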