Reading this in 2026, I realize my process for working through problems has shifted — less sifting through internet posts, more AI banter (Claude, Gemini, ChatGPT). I'm keeping this page up anyway, because one of the most boring posts below is what prompted someone to reach out — and then become an intern, a colleague, then a friend.
Most of my day-to-day activities, when I'm not on an exciting expedition, involve developing and testing new ideas and research, as well as software and mechanical devices. Like so many others, when I know a task should be fairly standard (e.g., yesterday I wanted to convert ten thousand .pgm files into .tiff files) but I don't know how to do it, I Google it and hope an Internet user or blogger-from-heaven has written about it (THANK YOU, "Matt"; I installed ImageMagick and used its 'convert' function). When Google doesn't have the answer, however, I take extra time to figure out the solution, usually by asking people around the lab or by trial-and-error plus reading bits of the manual. I'm going to try to give back more by recording oddball technical things that might save others time and frustration. It's like take-a-penny-leave-a-penny with the Internet. I realize this changes the audience of my blog, so I'm posting these under this new page, "Pennies."

This is a guest post by the incredible Kathryn Ellen Whittey from Cardiff University! The two of us ran a workshop together at the 2017 European Coral Reefs Symposium about how to create and analyse 3D models of coral reefs using PhotoScan and Rhinoceros 3D ("Rhino"). The methods are outlined in my PLoS ONE publication, "Cost and time-effective method for multi-scale measures of rugosity, fractal dimension, and vector dispersion from coral reef 3D models," and here they are explained in a slightly friendlier way. If you have issues, please comment below and we'll work something out. I've also published 3D modeling teaching notes (link here) and a tutorial on how to rotate and scale models in Rhino (link here). Onto Kath!
Video 3: Measuring Rugosity. (Note: this video has no audio.)
(Note: this video has no audio.) Questions, comments, or something still not working? Comment below and we'll work it out.
Last autumn in Oxford I had an enlightening coffee with Katherine Fletcher, coordinator of Oxford Cybersecurity. Cybersecurity isn't a department at Oxford; rather, it's a community of a dozen or so principal investigators and nearly 90 doctoral students spread across 26 departments, faculties, and institutes. It takes a full-time coordinator like Katherine to keep track of everyone. For example, there are computer scientists working on network security, mathematicians working on anomaly detection, criminologists working on understanding adversarial mindsets, policy thinkers working on legal implications (e.g., my fellow Marshall Scholar and author of The Cybersecurity Dilemma), and others. If cybersecurity were its own department, it would pull in ~£33 million in funding. Needless to say, it's an incredibly pertinent topic today: there have been massive cyber attacks in the past few years, on land and at sea, causing billions in damage. As chaotic as this interdisciplinary subject may appear, it reminds me of the situation Oceans@MIT highlighted at MIT: an ocean researcher is spread somewhat thin across the Ocean Engineering sub-department; the Mechanical Engineering Department; the Earth, Atmospheric, and Planetary Sciences Department; and our sister institution, Woods Hole Oceanographic Institution, among others.

U.S. Navy ships conducting security exercises off the coast of Southern California. Credit: U.S. Navy, labeled for reuse.

What did we talk about? Cybersecurity meets oceans! I'm interested in the unique challenges facing offshore industries, especially those with environmental consequences. These include security of cargo, security of open-ocean lanes, and offshore drilling and resource-dredging activities. I'm also interested in coordination between companies, including how new data regulations may yield win-win scenarios with research groups.
Companies in the EU will soon face significant obligations, and potential fines, for the data they hold under the General Data Protection Regulation. What if it were more beneficial for companies to buy data from each other instead of, say, conducting repeated seismic airgun surveys, which cause widespread harm to marine life? DNV GL and other maritime assurance and insurance companies have already expressed interest in this sector (e.g., DNV GL, Lloyd's Register working with the Royal New Zealand Navy, and the group "Be Cyber Aware at Sea"), though, not surprisingly, most don't publish much about what they're doing here. I have much to learn.
One thing we did during the NASA Frontier Development Lab was write a script to automatically mask the signal in delay-Doppler radar images (examples below). This is useful for planetary astronomers. If you'd like access to the script and its documentation, please contact me via my contact page; the documentation (version 1.0) is also online here. This was developed at the NASA Frontier Development Lab, working with Agata Rożek, Sean Marshall, Adam Cobb, Justin Havlovitz, Chedy Raïssi, Michael Busch, and Yarin Gal.
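The actual FDL script is available on request as described above; to give a flavor of what "masking the signal" can mean, here is a generic sketch of one common approach (sigma-clipping threshold plus morphological cleanup). The toy image, the 3-sigma threshold, and the morphology parameters are all my assumptions, not the team's pipeline:

```python
import numpy as np
from scipy import ndimage

# Toy delay-Doppler image: Gaussian noise background plus a bright echo region
rng = np.random.default_rng(0)
img = rng.normal(0.0, 1.0, (64, 64))
img[20:30, 25:40] += 8.0  # simulated radar echo

# Threshold a few sigma above the image statistics, then clean up the mask:
# closing fills small holes in the echo, opening removes isolated speckles
thresh = img.mean() + 3 * img.std()
mask = img > thresh
mask = ndimage.binary_closing(mask, iterations=2)
mask = ndimage.binary_opening(mask, iterations=1)
```

With real data, the threshold and structuring elements would need tuning per image, which is exactly the kind of thing an automated script has to handle.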
I've recently been thinking about the theory of diminishing returns. It's like the tan function applied to business/economics concepts. For example, say you're investing loads of money and effort into safety. That's great, but past a point, no matter how much money you throw at the problem, you can't reach a 100% safe system; something can always go wrong. #randomthoughts
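The tan analogy above can be made concrete with a toy model (my illustration, not a real economic formula): treat the cost of pushing a system to safety level s as tan(s·π/2), which blows up as s approaches 1.

```python
import math

# Cost of reaching safety level s, under the (toy) model cost = tan(s * pi/2):
# each step closer to 100% costs dramatically more than the last
costs = [math.tan(s * math.pi / 2) for s in (0.5, 0.9, 0.99, 0.999)]
```

Going from 50% to 90% safe is cheap in this model; going from 99% to 99.9% is an order of magnitude more expensive, and 100% is unreachable at any price.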
It's no secret that installing OpenCV is by far the most annoying part of using it... but once it's running on your machine it's great! That said, to make a simple script super user-friendly, I'd rather not require (or assume) that a user has already compiled OpenCV. I recently found myself needing only the function cv2.blur from OpenCV. To implement the same function without OpenCV, you can use scipy.ndimage.convolve (older SciPy versions expose it as scipy.ndimage.filters.convolve); e.g.:
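The original snippet isn't preserved on this page; a minimal sketch of the ndimage route (the 5 × 5 kernel size and 'nearest' border mode are my assumptions) would be:

```python
import numpy as np
from scipy import ndimage

A = np.random.rand(255, 512)       # example image
k = 5                              # box-filter size (assumed)
kernel = np.ones((k, k)) / k**2    # normalized box kernel, like cv2.blur's
B = ndimage.convolve(A, kernel, mode='nearest')
```

Away from the image edges, each output pixel is just the mean of the k × k window around it.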
In the above example, the array B would be the same as from:
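Again, the original snippet is missing here; with OpenCV installed, the equivalent call would look like the sketch below. One caveat: "the same" strictly holds for interior pixels, since cv2.blur's default border handling differs from whatever border mode you pick for ndimage. The try/except lets the sketch degrade gracefully when cv2 isn't installed:

```python
import numpy as np

A = np.random.rand(255, 512)
try:
    import cv2
    B = cv2.blur(A, (5, 5))  # 5x5 normalized box filter
    # Interior pixels equal the plain 5x5 window mean
    ok = np.isclose(B[100, 100], A[98:103, 98:103].mean())
except ImportError:
    ok = True  # cv2 unavailable; nothing to compare
```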
That said, cv2.blur is marginally faster (it's written in C++). For example, I found that with a 255 × 512 array, cv2.blur was on average 0.008 seconds faster than ndimage.convolve as used above (tested on n = 50 different arrays), and of course that gap scales with larger images. So you might consider writing your code with both options; e.g., in your import statements have something like:
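The import-statement snippet is missing from this page; one way to sketch the fallback pattern described above (the wrapper name box_blur and the default kernel size are mine) is:

```python
import numpy as np

try:
    import cv2

    def box_blur(img, k=5):
        # Fast path: OpenCV's C++ box filter
        return cv2.blur(img, (k, k))
except ImportError:
    from scipy import ndimage

    def box_blur(img, k=5):
        # Fallback: same normalized box filter via SciPy
        return ndimage.convolve(img, np.ones((k, k)) / k**2, mode='nearest')

A = np.random.rand(255, 512)
B = box_blur(A)
```

Callers just use box_blur and get whichever backend is available.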
This was developed with Chedy Raïssi. Hope it helps someone. Please post any comments below.
DAT files (.dat) are a common file type for storing data. They can be stand-alone files, but are often accompanied by a configuration file such as a header file (.hdr). This summer my team at NASA is working with .dat and .hdr files containing radar data of near-Earth asteroids; the files are generated by an asteroid modelling program called SHAPE. We are experimenting with preprocessing the data in MATLAB. There wasn't an existing way to straightforwardly work with the .dat files in MATLAB, so we had to write our own script, which uses information contained in the .hdr file to read the .dat file as an image. That code is below in case it's useful to somebody. EXAMPLE DATA:
MATLAB Script:
EXAMPLE RESULTS: Run the function on the example data: >> pixval = datHdrConv('data/run20040918175737'); This code was developed with Agata Rożek. Please post any comments below.
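For readers without MATLAB, the general idea (parse dimensions and data type from the header, then reinterpret the raw bytes as an image) can be sketched in Python. This is my generic illustration, not the datHdrConv script, and the header fields (samples, lines, dtype) are invented for the example; real .hdr formats vary:

```python
import numpy as np

# Invented header contents, standing in for a .hdr file
hdr_text = "samples 512\nlines 255\ndtype float32"
hdr = dict(line.split() for line in hdr_text.splitlines())
width, height = int(hdr["samples"]), int(hdr["lines"])

# Fake raw .dat bytes for the demo (a real script would read them from disk)
raw = np.arange(width * height, dtype=np.float32).tobytes()

# Reinterpret the bytes using the header's shape and data type
img = np.frombuffer(raw, dtype=hdr["dtype"]).reshape(height, width)
```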
The PowerPoint below describes how to start working in Rhino to measure structural complexity. It's based on my notes from teaching students last summer. The method and associated scripts can be found in our journal publication, Cost and time-effective method for multi-scale measures of rugosity, fractal dimension, and vector dispersion from coral reef 3D models. Additionally, I've written some guides to specific topics: