noaa-afsc / nfs-berpac

Telemetry data and analysis from bio-logger deployments on northern fur seals in the Bering Sea and Pacific Ocean. Maintained by Josh London (@jmlondon / [email protected])

Home Page: https://noaa-afsc.github.io/nfs-berpac/

License: Creative Commons Zero v1.0 Universal

R 100.00%
animal-movement-tracking marine-mammals northern-fur-seals alaska california protected-species

nfs-berpac's Introduction

Northern Fur Seal Movement, Dive Behavior, and Oceanography in the Bering Sea and Pacific Ocean

Project Summary

In 2018, the AFSC Polar Ecosystems Program and Wildlife Computers began collaborating on a project to develop a new family of SPLASH bio-loggers: a small-footprint device capable of FastLoc GPS, ocean temperature profiles, and on-board dive behavior summaries that provide unbiased data and efficient transmission via Argos. Due to the COVID-19 pandemic, the intended deployments on ribbon and spotted seals in 2020 were not possible, and in the spring of 2022 only 2 of the 25 available devices were deployed. Instead of letting these devices sit on the shelf for 2+ years, we have developed a plan to deploy a portion of them on northern fur seals at San Miguel Island and St. Paul Island. At least 10 devices will be reserved for future deployments on ice seals via collaborative research with Alaska Native communities. These tags were originally funded by the US Navy in 2017.

These tags employ two key enhancements that have not been previously available:

  1. Improved ocean temperature profile capabilities with a dedicated external thermistor probe
  2. On-board processing of time-at-depth data to create a more efficient dive behavior summary message that relies on empirical cumulative distributions instead of the typical histogram bin approach used for decades.
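The ECDF-based summary in item 2 can be illustrated with a few lines of base R. This is a hedged sketch: the depth samples and bin boundaries below are hypothetical, and the actual on-board message format is not shown here.

```r
# hypothetical time-at-depth samples (depths in meters) from one summary period
depths <- c(2, 5, 5, 8, 12, 12, 15, 20, 20, 20, 35, 60)

# conventional approach: proportion of time in fixed histogram bins
bins <- c(0, 10, 25, 50, 100)
prop_by_bin <- table(cut(depths, bins)) / length(depths)

# ECDF approach: evaluate the empirical cumulative distribution at a few depths;
# the shape of the full distribution can be reconstructed from these points
f <- ecdf(depths)
f(c(10, 25, 50))  # proportion of time at or shallower than each depth
```

The advantage is that a handful of ECDF evaluation points can describe the full distribution of time at depth, whereas histogram bins discard everything about where time falls within each bin.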

Because relatively few devices with these new capabilities have been deployed (by MML or other researchers), there is a great need to observe how these new devices perform, to evaluate how efficient the data transmissions are, and to learn from real-world experience so future deployments can be optimized for a range of study questions.

Visit the complete briefing paper and proposal for additional details.

Deployment Details

In September 2022, SPLASH bio-loggers were deployed on 12 adult female northern fur seals at San Miguel Island, California (n=6) and St. Paul Island, Alaska (n=6).

This Google Sheet lists deployment and animal details.

Preliminary Information

Preliminary analysis and visualizations are available via the project webpage. The information on this page is updated automatically every hour without significant quality checks, so errors are likely. We provide this information in a public forum in the spirit of open science and to give the authors and interested researchers easy access to up-to-date information.


Disclaimer

The scientific results and conclusions, as well as any views or opinions expressed herein, are those of the author(s) and do not necessarily reflect the views of NOAA or the Department of Commerce.

This repository is a scientific product and is not official communication of the National Oceanic and Atmospheric Administration, or the United States Department of Commerce. All NOAA GitHub project code is provided on an ‘as is’ basis and the user assumes responsibility for its use. Any claims against the Department of Commerce or Department of Commerce bureaus stemming from the use of this GitHub project will be governed by all applicable Federal law. Any reference to specific commercial products, processes, or services by service mark, trademark, manufacturer, or otherwise, does not constitute or imply their endorsement, recommendation or favoring by the Department of Commerce. The Department of Commerce seal and logo, or the seal and logo of a DOC bureau, shall not be used in any manner to imply endorsement of any commercial product or activity by DOC or the United States Government.

nfs-berpac's People

Contributors

jmlondon


nfs-berpac's Issues

Predicted paths are sometimes wildly wrong for STP deployment

Occasionally, some (maybe just one) of the model predictions for St Paul produce tracks that are wildly incorrect. For example, see the following screenshots:

(screenshots from 2022-11-01 showing the erroneous looping tracks omitted)

I've confirmed it's not an artifact of the interactive mapping package used on the website as the same loop is apparent when plotting within R:

(R plot showing the same loop omitted)
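One way to screen for these artifacts is to flag consecutive predicted positions that imply implausible swimming speeds. This is a hedged sketch, not the project's actual workflow: `flag_speed` and the 5 m/s threshold are assumptions, and the distance calculation uses a cheap equirectangular approximation.

```r
# flag predicted fixes whose implied speed from the previous fix exceeds max_ms
flag_speed <- function(lat, lon, time, max_ms = 5) {
  to_rad <- pi / 180
  dlat   <- diff(lat) * to_rad
  dlon   <- diff(lon) * to_rad * cos(lat[-length(lat)] * to_rad)
  dist_m <- sqrt(dlat^2 + dlon^2) * 6371000  # equirectangular approximation
  speed  <- dist_m / as.numeric(diff(time), units = "secs")
  c(FALSE, speed > max_ms)  # TRUE marks a suspect fix
}
```

A flagged fix could then be inspected or dropped before the track is re-fit, which would at least keep the wildest loops off the public map while the underlying model issue is diagnosed.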

create deployid values for each deployment

We need a deployid that uniquely identifies each deployment across the Wildlife Computers Data Portal, PEP Postgres Database, US Animal Telemetry Network, and any AEP or CCEP internal databases.

The standard approach within PEP has been to concatenate the Animal ID (speno) with the tag's serial number. My suggestion would be to take the same approach for this project. However, @Tony-Orr or @cekuhn might have other ideas we should consider.
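Under the PEP convention described above, constructing the deployid is a simple concatenation. The speno and serial values below are hypothetical, and the underscore separator is an assumption:

```r
# hypothetical animal ID (speno) and tag serial number
speno  <- "NFS2022001"
serial <- "22U1234"

# deployid: speno concatenated with the tag's serial number
deployid <- paste(speno, serial, sep = "_")
deployid  # "NFS2022001_22U1234"
```

Because a tag can be redeployed on a different animal (and an animal can carry different tags over time), the speno + serial pair is what makes the identifier unique per deployment.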

update deployment coordinates - San Miguel

The currently listed deployment coordinates (34.03, -122.04) are ~100 mi west of San Miguel Island. Ideally these coordinates would be the beach location where the seals were released. Relatedly, the deployment sheet currently only has dates entered; ideally, we'd have date-time values (in GMT) for release so I can initiate the FastlocGPS based on that time (along with the deployment location coordinates).
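A quick haversine check in base R confirms the offset; the San Miguel Island coordinates used below are approximate:

```r
# great-circle distance in km between two lat/lon points (haversine formula)
hav_km <- function(lat1, lon1, lat2, lon2, r = 6371) {
  to_rad <- pi / 180
  dlat <- (lat2 - lat1) * to_rad
  dlon <- (lon2 - lon1) * to_rad
  a <- sin(dlat / 2)^2 +
    cos(lat1 * to_rad) * cos(lat2 * to_rad) * sin(dlon / 2)^2
  2 * r * asin(sqrt(a))
}

# listed deployment coordinates vs. an approximate San Miguel Island location
hav_km(34.03, -122.04, 34.04, -120.37)  # on the order of 150 km (~95 mi)
</imports>
```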

Consider leaflet for web-map instead of mapdeck

The current implementation of the web maps relies on the rdeck package and a custom Mapbox style. Visually, it is appealing, but there are a few drawbacks:

  • there are more examples of customization with the leaflet framework
  • the leaflet framework may allow different projections
  • the ESRI World Ocean Basemap provides more detail and information regarding bathymetry
  • leaflet provides easier support for additional raster layers (e.g., cold pool, SST)
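A minimal sketch of what a leaflet-based version could look like (this is not the project's actual map code, and the site coordinates are approximate):

```r
library(leaflet)

# ESRI World Ocean Basemap via leaflet's built-in provider tiles,
# with markers at the two (approximate) deployment sites
m <- leaflet() %>%
  addProviderTiles(providers$Esri.OceanBasemap) %>%
  addMarkers(lng = c(-120.37, -170.28), lat = c(34.04, 57.12),
             label = c("San Miguel Island", "St. Paul Island"))
```

Additional raster layers (e.g., SST) could then be overlaid with `addRasterImage()`, which is the kind of extension that is harder to do in the current rdeck setup.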

update deployment details on wildlife computers data portal

Deployment details (deployid, release times, release coordinates) need to be updated on the WC Data Portal. However, we're holding on this until all data for these PTTs is merged into a single record under the Argos sub-program (10522).

empty CSV files downloaded from Wildlife Computers API

The number of build errors resulting from empty data files returned by the Wildlife Computers API has increased in the last few weeks. Previously this happened 1-2 times (hourly builds) per day, but recently it has increased to 6-10 times per day.

This coincides with a separate note from @StacieKozHardy that she's seen similar issues in downloads of PEP telemetry data.

I suspect this issue resides within the wcUtils package and how it interacts with the Wildlife Computers API. It may also be that there are some limitations or other issues with the Wildlife Computers API causing the return of empty data.
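Until the root cause is identified, a defensive wrapper could treat empty files as transient failures and retry. This is a sketch: `read_csv_retry` is a hypothetical helper, and `path_fn` stands in for whatever wcUtils call downloads the file and returns its path.

```r
# retry a CSV download, treating zero-byte or zero-row files as failures
read_csv_retry <- function(path_fn, tries = 3, wait = 5) {
  for (i in seq_len(tries)) {
    path <- path_fn()  # hypothetical: downloads and returns a local file path
    if (file.exists(path) && file.size(path) > 0) {
      dat <- tryCatch(read.csv(path), error = function(e) NULL)
      if (!is.null(dat) && nrow(dat) > 0) return(dat)
    }
    Sys.sleep(wait)  # back off before retrying
  }
  stop("no non-empty CSV after ", tries, " attempts")
}
```

Failing loudly after the retries are exhausted would at least turn silent empty-data builds into a visible error in the Actions log.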

develop automated reporting via Quarto

This project could serve as a development space for exploring/learning approaches and workflows for automated reporting of near realtime telemetry data. Ideally, we would converge on a process that could be applied to other telemetry projects.

Possible Approaches:

  • Quarto Document, Manually Rendered, Emailed Manually
  • Quarto Document, Automatically Rendered via NMFS RStudio Connect, Published to RStudio Connect
  • Quarto Document, Automatically Rendered via Github Actions, Published to Github Pages
  • Quarto Document, Automatically Rendered and Published via unknown TBD process

Initial R Development Work:

Create downloadable CSV of the raw dive behavior data

@cekuhn suggested it would be nice to have a way to download and explore the dive behavior in more detail. This is especially important because the 'raw' data from the WC portal is pretty difficult to interpret.

The idea here would be to create a CSV with deployid, date/time (hour), and a column for each depth bin. The values would be the proportion of time submerged, as shown in the plots.

Creating the CSV should be pretty easy. Creating the download option might take some Googling and hacking.
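A sketch of the proposed CSV structure — the deployid, bin labels, and values below are all hypothetical:

```r
# one row per deployid/hour, one column per depth bin; values are the
# proportion of time submerged in that bin during the hour
dive <- data.frame(
  deployid    = "NFS2022001_22U1234",
  datetime    = as.POSIXct("2022-10-01 00:00:00", tz = "UTC"),
  bin_0_10    = 0.40,
  bin_10_25   = 0.35,
  bin_25_50   = 0.20,
  bin_50_plus = 0.05
)
write.csv(dive, "dive_behavior.csv", row.names = FALSE)
```

One wide row per hour keeps the file easy to open in a spreadsheet, which seems to be the point of the request.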

Setup GitHub Actions to Render and Publish to gh-pages

For now, we'll go with GitHub Actions + GitHub Pages to render the report. I've already set up renv and an initial GH Actions workflow.

For this, we'll need to make the repo public (for GitHub Pages) and include repository secrets to connect with the Wildlife Computers API.
