baggepinnen / blobtracking.jl

Detect and track blobs in video

License: MIT License

blob-tracking blob-detection computer-vision kalman-tracking track-blobs detect-blobs video-processing object-tracking object-detection bird-tracking

blobtracking.jl's Introduction

BlobTracking


Detect and track blobs (like birds or bugs) moving around in an image. Blobs are detected using simple Laplacian-of-Gaussian filtering (from Images.jl) and tracked using a Kalman filter from LowLevelParticleFilters.jl.

This package contains some facilities for the aforementioned detection and tracking, as well as some utilities for background removal etc.

Usage

In the example below, we are tracking birds that fly around a tree.

Load a video

using BlobTracking, Images, VideoIO
path = "/home/fredrikb/Video/2_small.MP4"
io   = VideoIO.open(path)
vid  = VideoIO.openvideo(io)
img  = first(vid)


This package implements an iterator interface for VideoIO videos. The iterator yields grayscale images only, even if the original video is in color.

Create a background image

We create a background image to subtract from each frame.

medbg = MedianBackground(Float32.(img), 4) # A buffer of 4 frames
foreach(1:4) do i # Populate the buffer
    update!(medbg, Float32.(first(vid)))
end
bg = background(medbg)
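The idea behind MedianBackground is a per-pixel median over the buffered frames: pixels belonging to moving objects are outliers in time and get filtered out. A minimal pure-Julia sketch of the concept (not the package's actual implementation, which updates its buffer incrementally via update!):

```julia
using Statistics

# Stand-ins for four grayscale video frames (constant images with values 1, 2, 3, 4)
frames = cat([fill(Float32(i), 2, 2) for i in 1:4]...; dims=3)

# The background estimate is the per-pixel median over the buffer
bg = dropdims(median(frames; dims=3); dims=3) # 2×2 matrix, every pixel == 2.5
```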

Create a mask

If you want to detect birds (blobs) in the entire image, you can skip this step.

A mask is a binary image that is true wherever you want to be able to detect blobs and false wherever detections should be ignored.

mask = (bg .> 0.4) |> reduce(∘, fill(erode, 30)) |> reduce(∘, fill(dilate, 20)) # threshold, then erode 30×, then dilate 20×
mask[:,1190:end] .= 0
mask[end-50:end,:] .= 0
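The mask construction composes 30 erosion passes (and then 20 dilation passes) into a single function via reduce over the function-composition operator ∘. The same idiom works with any function:

```julia
inc(x) = x + 1
triple = reduce(∘, fill(inc, 3)) # inc ∘ inc ∘ inc
triple(0)                        # == 3: three composed applications
```

Applying the composed function once is equivalent to applying `erode` (or here `inc`) repeatedly, which is why the pipeline above reads as a single `|>` chain.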


Preprocessing

For the tracking to work well, it's important that we feed the tracker nice and clean images. An example of a preprocessing function looks like this; it takes a storage array that you can operate on in place, and the image to preprocess.

function preprocessor(storage, img)
    storage .= Float32.(img)
    update!(medbg, storage) # Update the background model
    # You can save some computation by not calculating a new background image every sample
    storage .= Float32.(abs.(storage .- background(medbg)) .> 0.4)
end

Notice how the tree contours are still present in this image? This is fine, since they lie behind the mask we created above. The mask was created by dilating the tree slightly, so that it covers a bit more than the tree itself. However, in this image you can also see two small spots to the right of the tree, representing birds.

Run tracking

We now create the BlobTracker and run the tracking. If we don't know an appropriate value for the sizes vector that determines the size scales of the blobs, we may call the function tune_sizes to get a small GUI with a slider to help us out (works in Juno and IJulia). The length of sizes has a large impact on the time it takes to process each frame, since the majority of the processing time is spent on blob detection.

bt = BlobTracker(3:3, # sizes
                2.0,  # σw Dynamics noise std.
                10.0, # σe Measurement noise std. (pixels)
                mask=mask,
                preprocessor = preprocessor,
                amplitude_th = 0.05,
                correspondence = HungarianCorrespondence(p=1.0, dist_th=2), # dist_th is the number of sigmas away from a predicted location a measurement is accepted
)
tune_sizes(bt, img)

result = track_blobs(bt, vid,
                         display = Base.display, # use nothing to omit displaying.
                         recorder = Recorder()) # records result to video on disk

To display images in a standalone window with okay performance, consider

using ImageView
c = imshow(img)
displayfun = img -> imshow!(c["gui"]["canvas"],img);
track_blobs(...; display = displayfun)

Blobs are shown in blue, newly spawned blobs are shown in green, and measurements are shown in red. If everything is working well, most blue dots should have a red dot inside or very nearby. If the blue blobs are lagging behind the red dots, the filter needs tuning by either decreasing the measurement variance or increasing the dynamics variance. If blue dots shoot off rapidly whenever measurements are lost, the dynamics variance should be decreased.
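To make the tuning advice concrete: if the blue (predicted) dots lag behind the red (measured) dots, one could rebuild the tracker with a larger σw and/or a smaller σe so that the Kalman filter follows measurements more closely. The values below are purely illustrative, not recommendations:

```julia
# Hypothetical re-tuning example: raise dynamics noise std σw and
# lower measurement noise std σe to make the filter trust measurements more.
bt = BlobTracker(3:3, # sizes
                4.0,  # σw (raised from 2.0)
                5.0,  # σe (lowered from 10.0)
                mask = mask,
                preprocessor = preprocessor,
                amplitude_th = 0.05)
```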

If you do not want to run the tracking and instead only collect all coordinates of detected blobs, you may call

coords = get_coordinates(bt, vid)

You can then later run the tracking on the stored coordinates with result = track_blobs(bt, coords), but when invoked like this you do not have the option to display or record images.

Visualization etc.

traces = trace(result, minlife=5) # Filter minimum lifetime of 5
measurement_traces = tracem(result, minlife=5)
drawimg = RGB.(img)
draw!(drawimg, traces, c=RGB(0,0,0.5))
draw!(drawimg, measurement_traces, c=RGB(0.5,0,0))


In the image, green dots mark spawning positions. The end point of a red measurement trace marks the last obtained measurement for a blob, while the end point of a blue location trace marks the point at which the blob was killed.

Below is a YouTube video showing how it looks: Video illustration

Further documentation

Most functions have docstrings. Docstrings of types hint at what functions you can call on instances of the type. The types present in this package are

  • Blob represents a blob and contains traces of locations and measurements, as well as the Kalman filter
  • BlobTracker contains parameters for the tracking and correspondence matching
  • KalmanParams stores the variance parameters for the KalmanFilter.
  • AbstractCorrespondence
    • HungarianCorrespondence matches blobs to measurements using the Hungarian algorithm
    • NearestNeighborCorrespondence matches blobs to the nearest measurement
    • MCCorrespondence uses Monte Carlo integration over the filtering distribution of the blobs and matches blobs to measurements several times using the chosen inner AbstractCorrespondence.
  • TrackingResult contains lists of dead and alive blobs
  • Trace is a list of coordinates
  • Recorder records movies and saves them on disk
  • FrameBuffer stores frames for temporal processing
  • BackgroundExtractor
    • MedianBackground models the background of an image
    • DiffBackground models the background of an image
  • Workspace is used internally

blobtracking.jl's People

Contributors: baggepinnen, github-actions[bot], juliatagbot

Forkers: fenderrex, mastrof

blobtracking.jl's Issues

Access start times of tracks

Is there a way to access the time (frame) at which a given track starts?

I was writing a function to produce customized animations (with Plots.jl) of the tracks, and realized that I could not find a direct way to access the time at which a given track begins.

(Following the notation in the bird tracking example) When I evaluate traces = trace(result, minlife=minlife), I get the coordinates that make up each track, but the information about the frame at which the first point of each track occurs is lost.
The workaround I'm currently using to find the initial time of a track goes as follows:

t0s = map(traces) do trace
    findfirst(frame_coords -> trace[1] ∈ frame_coords, result.coordinates)
end

that is, I compare the first point of each track against the list of all coordinates and take the frame where it first occurs.
But it's quite hackish. And of course it will not work in the general case (different tracks might pass through the exact same coordinates at different times).

Since the already implemented methods to output videos are able to display tracks at the right time, I'm sure there is a better way to do this. I just can't find it.

Thanks for the help (and thanks for the package!)

problem with BlobTracker and params

Hello!
I had been working in R on video tracking of insects, but now I want to migrate to Julia.
I am learning Julia and could not reproduce the example with my own test video.
I am stuck at:

bt = BlobTracker(sizes=3:3,
                mask=mask,
                preprocessor = preprocessor,
                amplitude_th = 0.05,
                correspondence = HungarianCorrespondence(p=1.0, dist_th=2), # dist_th is the number of sigmas away from a predicted location a measurement is accepted.
                σw = 2.0, # Dynamics noise std.
                σe = 10.0)  # Measurement noise std. (pixels)

UndefKeywordError: keyword argument params not assigned

Stacktrace:
 [1] top-level scope at In[5]:2
 [2] include_string(::Function, ::Module, ::String, ::String) at .\loading.jl:1091

As I said, I am new to Julia. Which step is wrong? Does the preprocessor function defined in the previous step need something else?

I am working on Win10 with Atom and Julia 1.5.1.

Todo

  • Separate update for tracker
  • Push collections to buffer
  • Collect video
  • Trace for each blob
  • Trace of positions, predictions and measurements
  • Post analysis
  • Composable temporal filters
  • Visualization
  • Auto tuning of covariance, smart defaults for size
  • Correspondence type with Hungarian or nearest neighbor as naive default
  • Replay tracking with different options
  • Run multiple tracking configurations at the same time
  • MonteCarlo correspondence

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!
