04.05.2014 08:24

More open sensor format discussion

Originally posted here: https://github.com/tkurki/navgauge/issues/4

Kees Verruijt pinged me about this discussion and suggested I should add my $0.02 on requirements. Interesting discussion. For me, JSON is nice to have on the side, but I can't really consider a text based message as the primary format. I've been thinking about how to replace things like Generic Sensor Format (GSF) and the zillion vendor binary formats for logging multibeam and sidescan sonars, LIDAR (e.g. tons of work in the PulseWaves project), high rate inertial navigation systems (IMU/INS), the full working set of what can be expressed for smaller sensors with NMEA 0183 messages, AIS, radars, imagery, carrier phase GPS (aka RTK, RINEX format, ...), acoustic recorders (hydrophones and geophones), E&M, an open version of MISLE ship incidents, whale sightings, data licenses, software versions used, the whole metadata world (an ISO paywalled disaster), etc. And then think about all of this combined for a very large number of platforms... e.g. all AIS receivers in a network, or all of the buoys, AUVs, ASVs, etc. in a fleet. On some systems, either because high resolution timing is key or because they are underwater for long periods of time without GPS (AUVs or dataloggers that may sit on the sea floor for months), multiple time sources may be essential. And I would love to see more at the OpenROV, OpenCTD, Arduino/Raspberry Pi level for cheaper and smaller sensors.

We then, as a community, need the ability to conflate data over long time series. e.g. Maritime Rule X changed on July 1st, 2007... how did the environment change in the years before and after? Did that increase or decrease regional fuel use, ship incidents, whale strikes, ship noise, economic activity at local ports, etc.? I'm already seeing streams that are GB to TB per day and need to scale up from there. For the browser, JSON is great for small things, but the overhead of converting strings to binary for rendering of, say, just 10M points in Chrome forced a switch to binary over the wire.

Even just the ARGOS float project has the potential to explode... they only have a few sensors right now, but what happens when they get tons more sensors and multiple routes for the data to make it to a usable datastore?

I've been worrying about this for a long time now. I've seen the issue everywhere: the number of sensors on the Healy all trying to spew NMEA 0183-ish stuff, NAIS with the USCG, and working with sonars. I tried to start writing some of my ideas here: http://schwehr.org/blog/archives/2014-02.html#e2014-02-18T09_06_12.txt (and other blog posts). However, I haven't gotten very far. I tried making a message definition language of my own to support AIS, but did not get very far in the RTCM community. My current thought is that the best core place to start is with Protocol Buffers (protobufs) and RecordIO or similar. That would allow for other serializations, but start off with a well defined message content that has a built-in binary serialization combined with the ability to add other serializations (e.g. JSON, XML, proto ascii, etc.) to the mix fairly easily. I'm also realizing that there is a need for a journaling style system (append only) that could allow index messages to be written to log files, which would make access to logs much faster. This becomes especially important if you are later trying to join chunks of logs that might come in anywhere from near realtime to months after the fact. There are plenty of other issues in there... e.g. nothing seems to talk about recording the calibration process in sensor log streams, or things like which source time came from and how accurate it is. And what license is the data released under? The Bathymetric Attributed Grid (BAG) format contains ISO XML metadata, but it only has "classified" / "non-classified" for data. What about public domain, proprietary, Creative Commons license X, etc.?
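As a sketch of the append-only journaling idea, here is a minimal length-prefixed (RecordIO style) log in Python. All of the names are mine and the byte payloads stand in for serialized protobuf messages; a real system would add checksums and write the periodic index records I mentioned.

```python
import struct

# Minimal sketch of an append-only, length-prefixed record log
# (RecordIO-style framing).  Payloads would be serialized protobuf
# messages in practice; here they are arbitrary bytes.

def append_record(fp, payload):
    """Append one length-prefixed record; return its start offset."""
    offset = fp.tell()
    fp.write(struct.pack('<I', len(payload)))  # 4-byte little-endian length
    fp.write(payload)
    return offset

def read_records(fp):
    """Yield (offset, payload) for every record in the log."""
    while True:
        offset = fp.tell()
        header = fp.read(4)
        if len(header) < 4:
            return  # clean end of log
        (length,) = struct.unpack('<I', header)
        yield offset, fp.read(length)

# A periodic "index" record could simply be a payload listing the
# offsets of earlier records, making later random access cheap.
```

Because every record carries its own length, logs from different platforms can be concatenated or spliced without re-parsing payloads, which is what makes the late joining of log chunks tractable.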

I think any spec has got to be totally open, and it would be best for it to be machine readable and usable by all for any purpose. e.g. paywalled or GPL'd specs are a real problem. Something like the Apache 2 license seems like the only way to go if we really want to open up data interchange.

I've spent a lot of time trying to survive the existing disaster of standards and want to be moving towards a world in which data is easier and more fun to work with from small devices to global scale fleets of sensors. Hopefully some of these thoughts are useful to the discussion you all are having here.

Some links to material that might be interesting for you all:



P.S. GSF is now homed at Leidos (which SAIC spun off) here: https://www.leidos.com/maritime/gsf (and still not yet under an open source license).

Posted by Kurt | Permalink

04.03.2014 10:33

Disabling autologin on a mac from the command line

Just did this remotely on a Mac. I had historically left autologin running, which was blocking some system admin tasks. Here is what I did to fix it.

http://www.cnet.com/news/how-to-disable-automatic-log-in-via-the-command-line-in-os-x/ and http://superuser.com/questions/40061/what-is-the-mac-os-x-terminal-command-to-log-out-the-current-user
ssh myhost
ps -aux | grep $USER
24
defaults read  /Library/Preferences/com.apple.loginwindow autoLoginUser
schwehr
sudo defaults delete /Library/Preferences/com.apple.loginwindow autoLoginUser
sudo osascript -e 'tell application "System Events" to log out'
ps -Ajc | grep loginwindow | awk '{print $2}' | sudo xargs kill
ps 

Posted by Kurt | Permalink

03.19.2014 19:08

Rebecca Moore at the White House - Google Earth Engine


Posted by Kurt | Permalink

03.17.2014 10:45

AIS talkers for NMEA VDM/VDO messages

I got an email with this question:
In my data feed I'm getting some sentences with packet types I do not
recognize.  Could you tell me what they mean?

!BSVDM
!ABVDM

I'm also getting multi-sentences with different packet types. Is that
allowed? First part can be ABVDM and second part AIVDM.
I was just asked in email to explain the first field of the CSV lines that hold AIS NMEA armored messages. Sadly, the NMEA specification is paywalled by the annoying folks at NMEA, so I can't share with you where talkers are defined. The same goes for the IEC standards that define AIS device messages and testing procedures. And I only have draft versions of a few of those that people on the committees have given me over the years. For example, I have "Maritime navigation and radiocommunication equipment and systems - Digital interfaces - Part 100: Single talker and multiple listeners - Extra requirements to IEC 61162-1 for the UAIS".

Back before ITU 1371 was opened up, I helped Eric Raymond (ESR) a bit with his writing of AIVDM/AIVDO protocol decoding. I still think that his document is drastically better written than the crap that is in the ITU documents. ESR also has NMEA Revealed with a list of talker IDs.

Let's start with a sample message:
!AIVDO,1,1,,B,403OvpQuUwgBMJRe6VE5Aai005q8,0*6F,b003669730,1249053583
The first two letters after the bang ("!") are the "talker." In NMEA speak, this is the device that generated the text in your local environment. It has nothing to do with any remote AIS devices and what they sent over the VHF radio. Above, we have "AI". This is the classic "AIS" device talker. It is what you get with most AIS receivers and transceivers (they are NOT transponders).

Following the talker are 3 letters that define the "sentence type." In the world of NMEA, this is the type of message that you are about to parse. VDO is a type that not many people know about. It says that the message that follows is an AIS message about "own ship." This really means that the device is talking about itself. The more commonly observed VDM means "AIS message that the device received over a VHF radio channel." The VDO in the above sample makes sense, as this is an AIS basestation telling us message 4 (basestation report) for itself. The message has two extra fields on the right: ",b003669730,1249053583". This is the older USCG logging format; it has a receiver identifier and a UNIX UTC timestamp. The "b" at the beginning of the receiver name says that the message was logged from a basestation. If we decode the message, we should find that the MMSI of the basestation report matches the receiver name.

This next sample message has "BSVDM". The talker is "BS". This means that we are hearing from a basestation. The logger has added ",r003669717" with an "r" for receiver, but it might just be misconfigured. This message is from my 2006 notes, when I was just figuring out AIS and what the USCG was doing back then. I did have a basestation on my desk for a while.
!BSVDM,1,1,,A,85MwpwQKf7sgdeioePgqI0@Q;F=0q3va1sdLFPCS4wB7PtDMD:62,0*34,r003669717,1165850344
And a final sample that has "ANVDM". This was an AIS ATON (aid-to-navigation) device that uses "AN" as the talker:
!ANVDM,1,1,,B,35MnbiPP@PJutdlH2M1sIq:80000,0*10,r003669947,1294012804
As for "AB", I was unsure. I did receive some sample data with an AB talker, but I do not know what kind of device was doing the logging. AB might stand for "AIS Basestation" in someone's mind. Looking at NMEA's closed documentation (yes, I paid a ridiculous amount for version 4.0 of the spec), I found:
AB Independent AIS Base Station
AD Dependent AIS Base Station 
AI Mobile Class A or B AIS Station
AN AIS Aids to Navigation Station
AR AIS Receiving Station
AS Limited Base Station
AT AIS Transmitting Station
AX AIS Simplex Repeater Station
SA Physical Shore AIS Station
There was no "BS" talker in there. Don't ask me to explain the differences between all those codes.
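The anatomy described above (two-letter talker, three-letter sentence type, and the trailing USCG station/timestamp fields) can be sketched in a few lines of Python. This is my toy illustration, with names of my own choosing; it is not a conforming NMEA parser and does no checksum validation.

```python
def split_ais_sentence(line):
    """Pull the talker, sentence type, and (if present) the old USCG
    trailing station/timestamp fields out of an AIS NMEA line.
    Illustration only: no checksum check, no armored-payload decoding."""
    fields = line.split(',')
    head = fields[0]  # e.g. '!AIVDO'
    result = {
        'talker': head[1:3],         # 'AI', 'BS', 'AN', 'AB', ...
        'sentence_type': head[3:6],  # 'VDM' or 'VDO'
    }
    # The older USCG logging format appends a station name and a UNIX
    # UTC timestamp after the standard checksum field.
    if len(fields) > 7:
        station = fields[-2]
        result['station'] = station
        result['station_kind'] = {'b': 'basestation',
                                  'r': 'receiver'}.get(station[:1], 'unknown')
        result['timestamp'] = int(fields[-1])
    return result
```

Running it on the first sample above would report talker 'AI', sentence type 'VDO', and a basestation logging station, matching the walkthrough.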

Please, please, please: If you have the opportunity to talk to people at NMEA, IMO, IALA, IHO, ITU, or IEC, tell them that closed / paywalled specifications are the work of the devil (and contradictory to the mission of SOLAS). It's bad enough that people aren't willing to put the discussion leading up to these specs online with their names attached. Can we please at least get ALL specs for these public data standards (because they are mandated by law in many cases) out there under free distribution licenses?!?!?

Posted by Kurt | Permalink

03.06.2014 21:43

Software Defined Radios

Last time I went looking for Software Defined Radios (SDR) to play with, the price was in the $300-$500 range and up. That is quite different from some of the options we have today. I just ordered a NooElec NESDR Mini SDR & DVB-T USB Stick (R820T) w/ Antenna and Remote Control. It's an R820T tuner IC made by Rafael Micro with a frequency range of roughly 25 MHz to 1750 MHz. It apparently can output 2.5 MS/s (million samples per second).

Posted by Kurt | Permalink

02.24.2014 00:12

pkg-config

I had never really used pkg-config before today. It turns out to be reasonably simple. I read through the man page and here is me messing with it.
type -a pkg-config
pkg-config is /sw/bin/pkg-config

dpkg -S /sw/bin/pkg-config
pkgconfig: /sw/bin/pkg-config

fink list -i pkgconfig | grep -v virtual
 i  pkgconfig                0.25-2  Manager for library compile/link flags

pkg-config --help
pkg-config --list-all
pkg-config --list-all | wc -l
    300
pkg-config --list-all | head -4
gstreamer-tag-0.10 GStreamer Tag Library - Tag base classes and helper functions
libmetalink libmetalink - Metalink library
xf86rushproto XF86RushProto - XF86Rush extension headers
fontsproto FontsProto - Fonts extension headers

pkg-config x11 && echo 'yes'
yes

pkg-config missing && echo 'yes'
pkg-config missing || echo 'do not have "missing"'
do not have "missing"

pkg-config --print-variables x11
xthreadlib
exec_prefix
prefix
includedir
libdir

pkg-config --print-provides x11
x11 = 1.6.2

pkg-config --print-requires x11
xproto
kbproto

pkg-config --cflags x11
-I/opt/X11/include

pkg-config --libs x11
-L/opt/X11/lib -lX11

Posted by Kurt | Permalink

02.18.2014 09:06

A unified marine sensor logging format

This is the first of a series of posts I hope to write.

https://xkcd.com/927/



See also: Toils of AIS by Eric S. Raymond and Kurt Schwehr.

I am frustrated that each sub-community of marine related sensors has its own data formats for raw logging. Even worse, each vendor has its own format (with many different messages as they innovate over the years). Those formats require a wide range of support libraries and are often not easily extensible. We have formats for multibeam sonars, seismic systems, acoustic recorders (hydrophones and geophones), seismic stations (really just 3-axis, 3-channel geophones), lidar, ships' navigation and GPS sensors, magnetometers, radar, grids, images, vectors, metadata, and more. Those formats treat data in other formats as orthogonal. Most of the encodings used for those formats are not even described in the Wikipedia article Comparison of data serialization formats. Worse yet, some key standards are behind paywalls, which greatly hampers adoption (e.g. try pricing all of the ISO standards related to metadata). On top of that, some of these standards are so confusing that it is nearly impossible to create conformant data streams.

In this series, I will be arguing that the sonar community needs to get away from systems like the Generic Sensor Format (GSF) and move to a unified machine and human readable specification system for the data stream at the lower levels. I will be pushing for Protocol Buffers (protobuf). Protobufs were created by Google and are heavily used within Google. I have a fair bit of experience with them, as I have been working for Google for the last 2 years. However, there are other formats, such as BSON, that could work. I do feel like I'm repeating myself with this: I argued for a custom XML representation of AIS messages starting in 2006 and declared defeat in 2010 as the AIS community opted to stick with their amazingly terrible status quo.

An additional concern is that people have proposed formats or tools that are only used in a small community, are not tested across a number of key programming languages, or revert to defining the raw bit level packaging such that the software engineers behind every piece of code that touches the format have to understand the low level details.

In this first post, I'd like to explore the use cases. What is the range of problems that we are trying to solve? I'm attempting to think about 50 or so years into the future. That is a near impossible task, but some things will almost certainly hold: storage of data will become more affordable in terms of cost, power, and volume/mass; compute power for a given budget will increase; and sensors will become more affordable while their diversity increases.

I have not had a chance to flesh out these use cases, but I do not want to wait any longer to get this starter post online.

* Single beam echo sounder from a ship

* Multibeam echo sounder ship

* Multi-channel seismic vessel

* Lidar from an aircraft

* An AUV, Glider, Argos style device with IMU

* Super cheap land sensor networks without GPS

* ADCP mounted on the sea floor

- without external intervention for long periods - event based time corrections

* An array of sea floor devices

- down for an extended period with slow intermittent acoustic connections

* Ping editing a multibeam sonar

* Post processing to get time and location

* Post collection indexing to allow fast access

* Later splicing of multiple platforms into a single stream

- inlining of forward and backwards index messages

* Non-survey ship underway

Posted by Kurt | Permalink

02.07.2014 10:50

Google Docs Mac OSX 10.9 Mavericks

For my desktop, I have Linux (I gave up on Macs for the desktop), but for my mobile devices, I've been using Mac hardware. Mac OSX is really frustrating at a low level compared to Linux, but I like the hardware (minus how hard it is to repair). Here is what I do to survive in the Mac OS X world. I've given up Microsoft Office, Photoshop, and Illustrator now that I'm not a full time professor making posters and living in the painful UNH IT environment. I used to keep this document in MediaWiki format inside of CCOM, but I don't want to have to tunnel into CCOM for it, and I'd like it to get more eyeballs. I rarely got feedback from others, so why would I keep it closed? I will be leaving out stuff specific to the places I work in a lot of cases, but I will share a bit when it doesn't expose confidential business practices.

At the time of this post, the document is hardly even a Work-In-Progress (WIP).

Mac OSX 10.9 Setup

Posted by Kurt | Permalink

02.07.2014 10:17

AutoAwesome winter in NH

AutoAwesome on G+ is often just cheesy, but I really like this one.


Posted by Kurt | Permalink

01.23.2014 06:03

Cross country flight

I just recently got to fly from the West Coast to the East Coast (US, that is). I always enjoy trying to capture good pictures when I have access to a window. A few years back, I always seemed to get people sitting next to me who claimed there was no way to get good photos from a commercial flight. Thankfully that has dropped off.

Images were taken with an iPhone 4S. The color is never quite up to what I see with the eye, and iPhoto's color tools are not good enough to do much of a fix.





The Sierras should be totally blanketed in snow by late January. This photo scares the heck out of me. California is in for a serious drought this year.



The Southern Snake Range / Great Basin National Park:





It's very sad to see hydraulic fracking pads all over the place. I only recently learned what to look for. Ignore the water treatment on the right and look for the mostly bare earth rectangles. What are we doing to the lands that we have to live on and grow food on, and to the aquifers that provide drinking water? Short term gain (for some) and long term disaster (and we don't even know all of the trouble we are in for). See also: frack.skytruth.org


Posted by Kurt | Permalink

01.01.2014 09:30

Starting out 2014

I'm not as cool as people like pydanny with posts like New Year's Meme 2013 that cover What's the Coolest Thing You Discovered This Year, What New Programming Technique Did You Learn This Year, Which Open Source Project Did You Contribute to the Most This Year? What Did You Do, What Are the Top Three Things You Want to Learn Next Year, and What is the Top Thing You Wish Someone Would Write Next Year?

I just try to survive the onslaught. As a part of that, I have some rituals. The most important one is that I keep a logfile where I track what I do every day in emacs org mode. I have a top level ('*') entry for each day and '**' entries for each general thing I work on during the day. For each day, I usually start with a '** todo :todo' section. I write in there all the things that I hope to get done that day as an org todo list, e.g. '- [ ] fix bug foo'. I'm always overly ambitious and rarely get them all done. In fact, some of them require more than that day or are flat out not in the range of things I can get done. I have a 'todo' alias that greps the current year's file for all lines that start with '- [ ]' (or contain '*+ TODO'). I often get to the point in the day where I'm totally overloaded with the current task and just staring at the screen. That's a good time for a task swap. So I run my todo alias and look through the backlog, trying to pick off a few small things to keep me going until I'm ready to get back into the current fight. That list gets long as the year moves on.
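The actual alias is a shell one-liner, but the idea fits in a few lines of Python. This is a hypothetical equivalent, not the alias itself:

```python
import re

# Hypothetical stand-in for my 'todo' shell alias: find open org-mode
# checkboxes ('- [ ]') and TODO headings ('** TODO ...') in a log file.
TODO_RE = re.compile(r'^\s*(- \[ \]|\*+ TODO)')

def open_todos(lines):
    """Return the lines that look like unfinished todo items."""
    return [line.rstrip() for line in lines if TODO_RE.match(line)]
```

Checked boxes like '- [X]' and lowercase '** todo' headings fall through, so only the open items come back.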

The best part of the new year is that I make a new org mode file and get to start with nothing on the todo list. I then skim last year's list and pull forward only those things that still feel like they should be in my consciousness. Those rafted todos are still important, but they don't deserve to cloud my mind. They serve to document what I was trying to do, things I thought would be nice, etc. They also provide a pool of potential side projects for others. Google has the concept of the 20% side project, and I sometimes get asked by people how they can help. If I need inspiration, I can look back through past years of todo items. As always, I have to thank Anne Wright for kicking me into the one log per year mode (and away from being dominated by un-grep-able paper log books) back in 2004 in MER mission control, during a middle of the night image processing hack session with the Ames crew.

The other thing I do is reset my shell logs. I log every command that I type on the command line on my work laptop and desktop (that returns successfully). That's about 50k commands on each. The log records when, what git branch I was on, what the current directory was, and the command. Admittedly, this is pretty boring, as it is dominated by ls, cd, egrep, find, locate, emacsclient, git status, git diff, and so forth. But, with that many commands, there are quite a few gems that I grep for every day. At the New Year, I move the current log from .shell.log to .shell.log.YEAR.

Top shell commands for 2013:
  1. ls 13289
  2. cd 9170
  3. git 5162
  4. grep 2349
  5. rm 2170
  6. find 1855
  7. less 1722
  8. fink 1312
  9. schwehr 1209 # Alias to ssh into my desktop
  10. ipython 1199
  11. mv 985
  12. open 953
  13. svn 949
  14. branch 848 # Alias to create a git branch
  15. ssh 795
  16. du 764
  17. make 720
  18. cp 713
  19. pwd 712
  20. ec 695
  21. mkdir 664
  22. e 637
  23. df 633
  24. echo 602
  25. scp 601
  26. gcutil 498
  27. gdalinfo 496
  28. cat 486
  29. tar 462
  30. bq 433
  31. gsutil 424
  32. fg 406
  33. python 406
  34. bg 406
  35. type 396
  36. sudo 394
After using the cut command to pick out the field with the command, I used this little bit of python to print the summary:
#!/usr/bin/env python
# Count how often each command appears in a file of one command per line.

import collections
import os

cnt = collections.Counter()

for line in open('cmds'):
  # Keep just the basename so that /sw/bin/ls and ls count together.
  cmd = os.path.split(line.strip())[1]
  cnt[cmd] += 1

for cmd, num in cnt.most_common(40):
  print('<li>' + cmd + ' ' + str(num) + '</li>')

Posted by Kurt | Permalink

12.24.2013 16:30

Hello go (golang)

I've been sick for 10 days and am going totally stir crazy. I figured it was time to finally get over the initial kickstart of using go. Till now, I've always left the direct use of go to my collaborator Francesc. I've read his code and it performs nicely. So it's time for me to try out go in my own way. Thanks to Brendan, there was an initial fink go package for version 1.1.1. However, this didn't work for me on Mac OSX 10.9. I tried fixing it, and bumped into the oddities of the go src package. Some go authors refer to this as the go way. I consider this just having a weak source setup process and no install method. Once it was clear that go is just like python with the system checking the timestamps against the compiled libraries (.go -> .a for go; .py -> .pyc for python), I was able to build a go 1.2 package that even includes bash command line completion (which still acts a bit strangely) and the emacs go mode. So being that I'm a fink maintainer, let's start there.
fink install go

echo "(require 'go-mode-load)" >> ~/.emacs

type go  # /sw/bin/go

go version
# go version go1.2 darwin/amd64

go help  # Lots of command options on the screen

cd
mkdir -p go/src/hello  # Violate the go tree layout for now cause we don't need it yet.

cd go/src/hello
 
cat > hello.go << EOF
package main

import "fmt"

func main() {
	fmt.Printf("Hello, world.\n")
}
EOF

go run hello.go
# Hello, world.
 
go build hello
can't load package: package hello: cannot find package "hello" in any of:
	/sw/lib/go/src/pkg/hello (from $GOROOT)
	($GOPATH not set)
So that was not really the right way to set up your go projects, but it was easy enough to get a kickstart. How about something a little more interesting? I like regular expressions (regexp) and wanted to see what it would be like to write a NMEA parser in go. I got stuck right off, but I got help over on Stack Overflow: Using named matches from Go regex. I really prefer to use named fields in regular expressions. It's slower, but the regular expressions are much more understandable.
mkdir ~/go/src/regexp

cd !:1  # !:1 says use the argument at position 1 in the prior shell command

emacs demo_regexp.go
Then put this code into emacs:
package main

import (
  "fmt"
  "regexp"
)

var myExp = regexp.MustCompile(`(?P<first>\d+)\.(\d+).(?P<second>\d+)`)

func main() {
  match  := myExp.FindStringSubmatch("1234.5678.9")
  result := make(map[string]string)
  for i, name := range myExp.SubexpNames() {
     result[name] = match[i]
  }
  fmt.Printf("by name: %s %s\n", result["first"], result["second"])
}
go run demo_regexp.go 
# by name: 1234 9
Now to change it up, I wanted to try go via Google App Engine (GAE). Google App Engine in Go. The download is a little bit confusing. I expected an installer or something. What I found was a zip that you have to unpack and put in your path. I guess I should make a fink package in my personal package tree for that, but some other day.
fink install wget unzip
wget http://googleappengine.googlecode.com/files/go_appengine_sdk_darwin_amd64-1.8.8.zip
unzip go_appengine_sdk_darwin_amd64-1.8.8.zip
cd go_appengine
export PATH=`pwd`:$PATH
mkdir -p ~/go/myapp/hello  
cd ~/go/myapp/hello  
cat << EOF > hello.go
package hello

import (
    "fmt"
    "net/http"
)

func init() {
    http.HandleFunc("/", handler)
}

func handler(w http.ResponseWriter, r *http.Request) {
    fmt.Fprint(w, "Hello, world!")
}
EOF
cd ~/go
goapp serve myapp &
open http://localhost:8080
I know I do things differently than the average person, I'm not on the Go team, and this is definitely not a sanctioned tutorial, but it's how I got started. I have tons more to do.

Posted by Kurt | Permalink

12.13.2013 17:58

Gebco multibeam cookbook

I was talking to people at AGU about common issues in multibeam sonar data. I made the comment that there is no great documentation out there that goes through the common issues seen in multibeam data: what the issues look like, what causes them, whether they can be prevented, and whether they can be fixed after the fact. Someone (perhaps Dave Sandwell) asked if I had seen the GEBCO Cookbook. I hadn't.
6.2.4 Common multibeam swath sonar errors
Spikes under nadir
Tracking water-column noise
Roll-over on inward-facing steep slopes
Anomalies on outward-facing slopes
Navigation missing on first ping
Inaccurate sound speed profile used
Here is one of those:
Tracking water-column noise 
 
Modern multibeam swath sonar systems generally include automated
bottom tracking algorithms. Such algorithms search for seafloor echoes
in a window centered on the previous (or adjacent) ping’s
depths. However, the system can begin to track water column noise, and
may continue to do so if the system is not actively monitored. Depth
variability between pings can be large, the across-track depth
profiles are not consistent with seafloor topography, and most of the
beams/pings should be discarded.
There aren't any images and the text is pretty short. So now I've seen it, but it seems pretty weak. This stuff is definitely a start, but it's so far from what I'd like to see in the end. And the work is copyrighted in a way that means I'm not going to contribute to it: "© Copyright International Hydrographic Organization [2012]". It would be nice if it were at least under some sort of open documentation license.
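The windowed bottom tracking that the cookbook excerpt describes can be sketched crudely: pick the strongest return inside a window centered on the previous ping's depth. This is my toy illustration of the failure mode, with made-up names, not any vendor's actual algorithm:

```python
def track_bottom(ping_samples, prev_depth, window=10):
    """Toy model of windowed bottom tracking: return the index of the
    strongest return within +/- window bins of the previous ping's
    depth.  If noise inside the window beats the true seafloor echo,
    the tracker locks onto the noise and the window follows it."""
    lo = max(0, prev_depth - window)
    hi = min(len(ping_samples), prev_depth + window + 1)
    return max(range(lo, hi), key=lambda i: ping_samples[i])
```

Once the window drifts up onto a water-column noise blob, the seafloor echo falls outside the search window entirely, which is exactly why the system has to be actively monitored.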

Posted by Kurt | Permalink

11.22.2013 12:40

NextGov Bold award for Whale Alert

Congrats to Dave Wiley and everyone who made Whale Alert possible. We are quite the large, distributed and non-traditional team.

Nextgov Honors Feds for Innovative Tech That Saves Time, Money and Lives
A NASA hackathon, an app to track whales and an initiative to allow
federal employees to use their own smartphones and gadgets for work
were among projects recognized Wednesday with the first annual Bold
Awards.

Nextgov received nearly 200 nominations for the contest. Our editorial
staff selected 19 finalists -- individuals and teams from a dozen
agencies -- that exemplify the kind of creative problem-solving,
technical acumen, ambition and persistence we frequently hear about in
the private sector but too seldom learn about in government.
...
There were 7 winners, including:
David Wiley of the National Oceanic and Atmospheric
Administration led a team to produce Whale Alert, a mobile app to
prevent endangered right whales from colliding with ships.
Monica pointed out a minor note about the above description. It would really be better stated that this app is for preventing ships from colliding with the right whales.

What Politicians Don't Tell You About Federal Employees - Nextgov Bold Awards Finalists

DAVID WILEY, LEILA HATCH and MICHAEL THOMPSON (left to right) at
NOAA's Stellwagen Bank National Marine Sanctuary created Whale Alert,
which provides up-to-date information about the critically endangered
North Atlantic right whale, displaying data on nautical charts via
iPad or iPhone platforms. To protect right whales from ship strikes, the
leading cause of death, NOAA has promulgated a suite of rules and
voluntary measures that affect East Coast shipping. Prior to Whale
Alert, information was delivered to ships in piecemeal fashion, making
compliance difficult and resulting in industry fines. With more than
17,000 downloads, the app has been highly successful in improving
conservation efforts.

Posted by Kurt | Permalink

11.22.2013 10:23

Google's Earth Engine use for forest evaluation


Posted by Kurt | Permalink

11.15.2013 07:14

Real data has real warts

Dale Chayes just sent me a note as he was watching one of my videos and heard me use his "real data has real warts" quote. Ever since Dale told me his little catch phrase, I can't get it out of my head. It's too true.

I hadn't seen this particular talk before of Dale's, but I'm pretty sure that he said that quote to me a few years before this talk.

Geophysical Data: "Real data has real warts"

ChayesDeesNoonBalloon2010-11-09.pdf

Some pretty awesome slides in that talk. This one particularly spoke to me with the antenna farm...


Posted by Kurt | Permalink

11.14.2013 10:31

Mapping Big Data with Google's Cloud Platform

Google Maps API G+ post



Strata London: Ships Around the World: Processing and Visualizing Large Global Datasets with the Google Cloud.

Mano's G+ post. The slide deck G+ post and the presentation is here: slides



I really appreciated the great audience questions, and the discussions with folks at the office hours with the Cloud team were awesome!

Posted by Kurt | Permalink

11.10.2013 09:04

Lots of happenings with Google

Too much going on to get a chance to blog about all of it, but here are some highlights.

Tomorrow (Monday), Mano Marks and I present Ships Around the World: Processing and Visualizing Large Global Datasets with the Google Cloud at Strata London. The Google Cloud team's Amy Unruh and Felipe will present Behind the Data Sensing Lab: Building a Highly Scalable and rapid data collection and analysis just before us at 1:15. Then they will be leading office hours at 2:45 right after Mano and I present.

Then Francesc Campoy presents Go see all the ships in the world at DevoXX on Thursday in Belgium.

Ujaval is visiting from India. I got to help him on the QGIS GME connector by doing code reviews. It was definitely fun to see how a QGIS plugin works!


Posted by Kurt | Permalink

11.07.2013 13:20

USCG Proceedings article on Marine Spatial Planning

http://www.uscg.mil/proceedings/archive/2013/Vol70_No3_Fall2013.pdf

Marine Sanctuaries and Marine Planning by Wiley, Hatch, Schwehr, Thompson and MacDonald.




Posted by Kurt | Permalink

11.04.2013 09:17

FORTRAN and go to / goto

This is just out and out evil. Do not use go to / goto. Don't use FORTRAN. Just don't. And if you use FORTRAN and gotos, never use a goto with a list of jump points and an expression to give yourself multiple destination options for "fan-out" or case like statements.

Here is me trying to figure out how gotos really work in FORTRAN. I didn't get through all of the cases that I needed to try before I ran out of gas.
c -*- fortran -*-
c
c Using gfortran 64bit on mac osx 10.8 from fink
c GNU Fortran (GCC) 4.8.2
c Compiler warnings (if anything new) included after each code
      program wtf
      go to 130
c
      write(6,*) 'anything'
 80   write(6,*) 'label 80'     
 130  write(6,*) 'label 130'  ! jumped here
c
      end
c This code printed "label 130"
c
c 
c 80   write(6,*) 'yes'
c   1
c Warning: Label 80 at (1) defined but not used
c
c ========================================
c
      program wtf
c
      assign 130 to myjump
      go to myjump
c
      write(6,*) 'anything'
 80   write(6,*) 'yes'     
 130  write(6,*) 'label 130'  ! jumped here
c
      end
c
c The code printed "label 130"
c
c Warning: Deleted feature: ASSIGN statement at (1)
c
c 130  write(6,*) 'this is statement 130'
c    1
c Warning: Label 130 at (1) defined but 
c
c ========================================
c
      program wtf
c
      assign 130 to myjump
      assign 80 to myjump
      go to myjump
c
      write(6,*) 'anything'
 80   write(6,*) 'yes 80'     ! jumped here
 130  write(6,*) 'this is statement 130'
c
      end
c
c ========================================
c
      program wtf
c
      go to (80, 130), 2  ! use the 2nd item in the list
c
      write(6,*) 'anything'
 80   write(6,*) 'yes 80'
 130  write(6,*) 'this is statement 130'  ! jumped here
c
      end
c
c ========================================
c
      program wtf
c
      go to (80, 130), 3
c
      write(6,*) 'anything'  ! NO jump.  ran from here down
 80   write(6,*) 'yes 80'     
 130  write(6,*) 'label 130'
c
      end
c
c The code did not jump and I saw anything, yes 80 and label 130
c
c ========================================
c
      program wtf
c
      assign 130 to mytest
      go to mytest ,(3,5)
c
      write(6,*) 'anything'
 80   write(6,*) 'yes 80'     
 130  write(6,*) 'this is statement 130'
c
      end
c
c DOES NOT COMPILE
c
c Error: Label 3 referenced at (1) is never defined
c 5.f:4.24:
c
c      go to mytest ,(3,5)                                               
c                        1
c Error: Label 5 referenced at (1) is never defined
c
c ========================================
c 
      program wtf
c ignores 80 and 130
      assign 80 to mytest
      go to mytest ,(80,130)  ! Always jumps to where ever mytest points
c
      write(6,*) 'anything'
 80   write(6,*) 'yes 80'     
 130  write(6,*) 'this is statement 130'
c
      end
I am missing cases like checking to make sure that assign and an integer variable with the same name are two separate entities. Additionally, what happens if the programmer uses a reserved word (e.g. return) for a variable or assigned label? And after all that, do other compilers behave the same? FORTRAN is definitely a SNAFU.

When I was at SIO for grad school, I had a good number of professors tell us, the students, that FORTRAN is the fastest language in existence. Oh really? These same professors didn't know about data structures. I tried talking about singly or doubly linked lists or trees and got blank looks... especially when I rewrote their big array codes to not be one giant blob of memory with endless memory copies. Sadly, the reverse was true. I don't grok Einstein notation, Hilbert spaces, or the other field theory things I was trying to learn from them. They assumed those would be as trivial to me as a linked list was to me. Why do mathematicians insist on one symbol variable names and extreme compression in their notation? I think that's my biggest problem with math. That, and studying a lot of these topics before Wikipedia was fleshed out.
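For contrast, the structured replacement for FORTRAN's computed goto fan-out, `go to (80, 130), n`, in most modern languages is a dispatch table. A hypothetical Python sketch (all names are mine):

```python
def label_80():
    return 'yes 80'

def label_130():
    return 'this is statement 130'

# The structured equivalent of `go to (80, 130), n`: a table mapping
# the jump index to a target, instead of raw labels.
dispatch = {1: label_80, 2: label_130}

def computed_jump(n):
    """Call the nth target.  Unlike FORTRAN, an out-of-range index
    fails loudly instead of silently falling through."""
    try:
        return dispatch[n]()
    except KeyError:
        raise ValueError('jump index out of range: %d' % n)
```

The out-of-range case is the point: the `go to (80, 130), 3` example above silently fell through, while a dispatch table turns the same mistake into an immediate error.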

Posted by Kurt | Permalink