kekyo / flashcap

Independent video frame capture library on .NET/.NET Core and .NET Framework.

License: Apache License 2.0

C# 97.13% Batchfile 0.13% Shell 0.84% F# 1.90%
image capture dotnet csharp independent directshow video-for-windows frame-grabber directshow-camera v4l2

flashcap's People

Contributors

awiswasi, dependabot[bot], gplwhite, kakulukiya45t654645, kekyo, muny


flashcap's Issues

Add support for video rotation

Hi,

I have been using FlashCap for one of my personal projects recently, and I'm really happy with it; the performance also seems very good from first impressions. However, one feature I'm missing is the ability to rotate the video. For some reason my camera delivers the feed rotated by 90 degrees, and I cannot find a way to correct this. On Windows I have it working locally using Magick.NET, but that approach doesn't work on the Raspberry Pi the app actually runs on. This made me wonder whether rotation is a feature that should be built into this library.
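Until rotation support lands in the library, rotating the decoded frame in managed code is one workaround that also runs on a Raspberry Pi. A minimal, self-contained sketch (assumes a tightly packed RGB24 buffer with no row padding; the helper name is illustrative, not FlashCap API):

```csharp
using System;
using System.Linq;

// Rotate a tightly-packed RGB24 frame 90 degrees clockwise.
// Assumes stride == width * 3 (no row padding).
static byte[] Rotate90Cw(byte[] src, int width, int height)
{
    var dst = new byte[src.Length];
    for (var y = 0; y < height; y++)
    {
        for (var x = 0; x < width; x++)
        {
            // Pixel (x, y) maps to (height - 1 - y, x) in the rotated
            // image, whose dimensions become height x width.
            var si = (y * width + x) * 3;
            var di = (x * height + (height - 1 - y)) * 3;
            dst[di] = src[si];
            dst[di + 1] = src[si + 1];
            dst[di + 2] = src[si + 2];
        }
    }
    return dst;
}
```

For a 2x2 image with pixel values [[1, 2], [3, 4]], a clockwise rotation yields [[3, 1], [4, 2]].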

I'd like to hear your opinion on this.

Improve performance on 4k preview

I have a webcam that supports 4k at 30fps. So far, the only .NET preview solution I can find that compares to the Windows Camera app and OBS is UWP's MediaCapture and MediaElement. Using these controls in WPF works for the most part, but the airspace issue hits you, and you have to make workarounds when incorporating them into your UI.

I love the simplicity of this library and was surprised that it even supports Avalonia. It would be great to have 4k preview support in the future.

A generic error occurred in GDI+ when processing MJPEG on WinForms

Exception thrown: 'System.Runtime.InteropServices.ExternalException' in System.Drawing.Common.dll
System.Runtime.InteropServices.ExternalException (0x80004005): A generic error occurred in GDI+.
at System.Drawing.Image.ValidateImage(IntPtr image)
at System.Drawing.Image.FromStream(Stream stream, Boolean useEmbeddedColorManagement, Boolean validateImageData)
at System.Drawing.Image.FromStream(Stream stream)
at FlashCap.WindowsForms.MainForm.OnPixelBufferArrived(PixelBufferScope bufferScope) in C:\temp\FlashCap-main\FlashCap-main\samples\FlashCap.WindowsForms\MainForm.cs:line 93
at FlashCap.FrameProcessors.DelegatedQueuingProcessor.ThreadEntry() in C:\temp\FlashCap-main\FlashCap-main\FlashCap.Core\FrameProcessors\QueuingProcessor.cs:line 159

Support Virtual Webcam

Hello,

Is it possible to connect to a 'virtual webcam'?
I tried the sample, but it could not find the OBS Virtual Camera, which I believe is a DirectShow 'device'.

Is this supposed to work, or would it be easy to implement?

Sincerely,

Consider removing aggressive inlining for small methods

The code can be simplified by avoiding MethodImpl with AggressiveInlining, at least for simple methods, since the JIT compiler is smart enough to inline these automatically, at least on .NET/.NET Core (I have no idea about .NET Framework and cannot check it). There is a rule that small methods whose IL size is less than or equal to 32 bytes are inlined automatically, and the attributed methods I checked just reroute execution. There are additional rules involving flow graphs, but again, these methods are very simple and should be inlined automatically.

How to detect if camera is being used by other applications?

The only approach I could think of is the pseudocode below. Is there a more reliable or better way to do this?

Load:
    start camera
    cameraInUseTimer.Interval = 1000;
    cameraInUseTimer.Elapsed += CameraInUseTimer_Elapsed;
    cameraInUseTimer.Start();

CameraInUseTimer_Elapsed:
    // No frame arrived within the interval: camera is in use

OnPixelBufferArrivedAsync:
    cameraInUseTimer.Stop();
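The watchdog idea above can be sketched concretely with System.Timers.Timer: a one-shot timer that fires only if no frame arrives within the interval, and is re-armed by each arriving frame. Names here are illustrative, not FlashCap API:

```csharp
using System;
using System.Timers;

// One-shot watchdog: fires if no frame arrives within 1 second.
var cameraInUse = false;
var cameraInUseTimer = new Timer(1000) { AutoReset = false };
cameraInUseTimer.Elapsed += (_, _) => cameraInUse = true;

// Call when capture starts:
cameraInUseTimer.Start();

// Call from OnPixelBufferArrivedAsync; each frame re-arms the timer:
void FrameArrived()
{
    cameraInUseTimer.Stop();
    cameraInUseTimer.Start();
}
```

This also covers the "why didn't the camera start" case: if StartAsync succeeds but the watchdog still fires, no frames ever arrived.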

I also need to know why the camera is unable to start.

A generic error occurred in GDI+

I run the FlashCap.WindowsForms app in Visual Studio 2022 and see only exceptions thrown when the image is restored from the stream:

System.Runtime.InteropServices.ExternalException (0x80004005): A generic error occurred in GDI+.
at System.Drawing.Image.FromStream(Stream stream, Boolean useEmbeddedColorManagement, Boolean validateImageData)
at System.Drawing.Image.FromStream(Stream stream)
at FlashCap.WindowsForms.MainForm.OnPixelBufferArrived(PixelBufferScope bufferScope) in C:\repos\temp\FlashCap-main\samples\FlashCap.WindowsForms\MainForm.cs:line 89
at FlashCap.FrameProcessors.DelegatedQueuingProcessor.ThreadEntry() in C:\repos\temp\FlashCap-main\FlashCap.Core\FrameProcessors\QueuingProcessor.cs:line 159

Maybe I am doing something wrong?

PS: FlashCap.Avalonia works fine

change camera settings

The library is phenomenal: fast and light!

I have a question: do you plan to add functions for changing camera settings (brightness, contrast, exposure, and other options that cameras support)?

run wpf sample app

When compiling and running the FlashCap.Wpf application, I get a runtime error:

System.IO.FileLoadException: 'Could not load file or assembly 'System.Runtime.CompilerServices.Unsafe, Version=4.0.4.0

The only way to solve it was to add a package reference to System.Runtime.CompilerServices.Unsafe (latest version, 6.0.0) and to update Epoxy.Wpf to the latest version, 1.11.0.

Happy to push the change if needed.

Video capturing on Mac devices

First of all, I would like to thank you for the library. It's really great and has a very clean API :) I had an almost complete implementation for capturing V4L2 video, but when I found FlashCap, I threw away the results of several days of my work without hesitation.

There's only one thing missing, in my opinion, which could be supported too: video capturing on Mac devices. There are examples in the official Apple documentation, but they use Objective-C; bindings can be taken from Xamarin (located in Xamarin.Mac).

If there's nothing more urgent in the project I'm working on, I can contribute it in the foreseeable future.

Frames are not delivered at the FPS specified in Characteristics



private static async void Start(int selectCameraIndex, int selectCharacteristicsIndex)
{
    var devices = new CaptureDevices();
    var descriptor0 = devices.EnumerateDescriptors().ElementAt(selectCameraIndex);

    Console.WriteLine($"Selected camera: {descriptor0.Name}  Characteristic: {descriptor0.Characteristics[selectCharacteristicsIndex]}");
    using var device = await descriptor0.OpenAsync(
        descriptor0.Characteristics[selectCharacteristicsIndex],
        bufferScope =>
        {
            Console.WriteLine($"Timestamp: {bufferScope.Buffer.Timestamp}  FrameIndex: {bufferScope.Buffer.FrameIndex}");
        });

    await device.StartAsync();

    Console.ReadKey();

    await device.StopAsync();
}

Even though I selected a 1 FPS Characteristic, frames are actually returned at 25 FPS.

Generate video

Hi!

Is there a way to save captured frames as a video?
I didn't find anything related to this topic.
Maybe FlashCap has native tools for this?

For now, I'm saving frames as .jpg files and then compiling them into a video with FFmpeg.
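FlashCap itself has no video encoder, so a common variant of the approach above is to skip the intermediate .jpg files and pipe the JPEG frames straight into an ffmpeg process via stdin. A rough sketch (assumes ffmpeg is installed and on PATH; the helper name is illustrative):

```csharp
using System.Diagnostics;

// Build an ffmpeg invocation that reads concatenated JPEG frames from
// stdin and encodes them into an H.264 MP4 file.
static ProcessStartInfo BuildFfmpegPipe(string outputPath, int fps) => new()
{
    FileName = "ffmpeg",
    Arguments = $"-y -f mjpeg -framerate {fps} -i pipe:0 -c:v libx264 -pix_fmt yuv420p {outputPath}",
    RedirectStandardInput = true,
    UseShellExecute = false,
};

// Usage sketch:
//   var ffmpeg = Process.Start(BuildFfmpegPipe("out.mp4", 30))!;
//   // In the pixel buffer handler, write each JPEG frame:
//   //   ffmpeg.StandardInput.BaseStream.Write(bufferScope.Buffer.ExtractImage());
//   // When done:
//   ffmpeg.StandardInput.Close();
//   ffmpeg.WaitForExit();
```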

Blue screen error without stopping

Hi team,
If I stop the project via Visual Studio without stopping the capture device, I get a blue screen on the computer. I think it's a general problem with DirectShow.
Do you have any suggestions for this?
The error I get is "SYSTEM THREAD EXCEPTION NOT HANDLED"

How everyone is using it?

This issue is purely out of my own interest: what are you applying this library to? Please leave a note if you don't mind.
Being able to know the situations makes me realize that I have a growing user base, motivates me to develop more, and makes dinner taste better! 😄

Missing Camera Characteristics

Hi,
I did a lot of research and testing and am still running into the following issue.

So I have a Logitech StreamCam connected via USB-C 3.0.
The sample code to print the camera characteristics works fine, but the 60 FPS option is missing (see screenshot).

For comparison, in the Windows Camera app there is an option to select 60 FPS.
Sorry if I am missing something obvious, but I don't see any solution to this problem.

For my project idea I absolutely need 60 FPS. Any ideas why the DirectShow profile for 60 FPS is missing?
Thanks in advance
Kind regards
M.

Capture Error event handler

Hi Team,
I can't find a capture error event handler. Is one available?
For example, I have a USB capture device. If I unplug the USB cable from the computer, I would expect a capture error to occur while capturing.
Thanks in advance.

Order of known fmtdesc.pixelformat entries determines whether all types are found

I have a new Logitech C920. This camera breaks the CollectWhile loop: MJPG is supported, but because it is listed after H.264, it is never found.


private static IEnumerable<v4l2_fmtdesc> EnumerateFormatDesc(int fd) =>
    Enumerable.Range(0, 1000).
    CollectWhile(index =>
    {
        var fmtdesc = Interop.Create_v4l2_fmtdesc();
        fmtdesc.index = (uint)index;
        fmtdesc.type = (uint)v4l2_buf_type.VIDEO_CAPTURE;

        return
            ioctl(fd, Interop.VIDIOC_ENUM_FMT, fmtdesc) == 0 &&
            IsKnownPixelFormat(fmtdesc.pixelformat) ?
            (v4l2_fmtdesc?)fmtdesc : null;
    }).
    ToArray();
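The failure mode can be shown in isolation. In the sketch below, LINQ's TakeWhile stands in for CollectWhile (both stop at the first rejected element): stopping on "unknown format" loses every known format listed after it, whereas stopping only when enumeration ends and filtering afterwards finds them all. Format names here are illustrative sample data:

```csharp
using System;
using System.Linq;

// The device reports formats in this order; only MJPG/YUYV are "known".
var reported = new[] { "YUYV", "H264", "MJPG" };
static bool IsKnown(string f) => f is "MJPG" or "YUYV";

// Current behaviour: the collect loop stops at the first unknown
// entry (H264), so MJPG is never reached.
var buggy = reported.TakeWhile(IsKnown).ToArray();      // ["YUYV"]

// Possible fix: enumerate everything the driver reports, then filter
// for known formats afterwards.
var fixedList = reported.Where(IsKnown).ToArray();      // ["YUYV", "MJPG"]
```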

Capture works first time on Linux, but not subsequent times

Was working on #133 and ran into an issue.

With a fresh main branch, I run the FlashCap.OneShot sample successfully:

cat ➜  net8.0 git:(main) ✗ ./FlashCap.OneShot
Selected capture device: USB3.0 UHD: USB3.0 UHD: usb-0000:00:14.0-1: uvcvideo, Characteristics=55, 1920x1080 [YUYV, 60.000fps]
Captured 6220854 bytes.
The image wrote to file oneshot.bmp.

However, subsequent runs of the sample crash in the YUV transcoder:

cat ➜  net8.0 git:(main) ✗ ./FlashCap.OneShot
Selected capture device: USB3.0 UHD: USB3.0 UHD: usb-0000:00:14.0-1: uvcvideo, Characteristics=55, 1920x1080 [YUYV, 60.000fps]
Fatal error. System.AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
   at FlashCap.Internal.BitmapTranscoder+<>c__DisplayClass1_0.<TranscodeFromYUVInternal>b__0(Int32)
   at System.Threading.Tasks.Parallel+<>c__DisplayClass19_0`2[[System.__Canon, System.Private.CoreLib, Version=8.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e],[System.Int32, System.Private.CoreLib, Version=8.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e]].<ForWorker>b__1(System.Threading.Tasks.RangeWorker ByRef, Int64, Boolean ByRef)
   at System.Threading.Tasks.TaskReplicator+Replica.Execute()
   at System.Threading.ExecutionContext.RunFromThreadPoolDispatchLoop(System.Threading.Thread, System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object)
   at System.Threading.Tasks.Task.ExecuteWithThreadLocal(System.Threading.Tasks.Task ByRef, System.Threading.Thread)
   at System.Threading.ThreadPoolWorkQueue.Dispatch()
   at System.Threading.PortableThreadPool+WorkerThread.WorkerThreadStart()
[1]    18403 abort (core dumped)  ./FlashCap.OneShot

The only way to make it work again is to unplug my UVC device and plug it back in. It then works for one frame, and no more.

After a little digging, I discovered that after this line:

if (ioctl(this.fd, Interop.VIDIOC_DQBUF, buffer) < 0)

The value of buffer.bytesused is 0.


If I change (int)buffer.bytesused to (int)buffer.length (they hold the same value, except when bytesused is 0) in the call to this.frameProcessor.OnFrameArrived, I can successfully capture multiple times.

It's not immediately clear to me what could be causing this; I don't know what the difference between bytesused and length is.
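For reference: in V4L2, length is the size of the mapped buffer, while bytesused is the payload size the driver actually wrote for the dequeued frame. The workaround described above amounts to a defensive fallback, sketched here as a standalone helper (hypothetical name, not the upstream fix):

```csharp
// bytesused should hold the dequeued frame's payload size, but, as
// observed above, some driver/device combinations report 0 after
// VIDIOC_DQBUF. Fall back to the mapped buffer length in that case.
static int EffectiveFrameSize(uint bytesused, uint length) =>
    bytesused != 0 ? (int)bytesused : (int)length;
```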

V4L2: Couldn't map video buffer: Code=22

I have been using FlashCap 1.5.0 on a Raspberry Pi 4 without any issues.
Now I have upgraded the OS from Raspbian 10 (buster) to Raspbian 11 (bullseye), a.k.a. the newest Raspberry Pi OS.
Unfortunately, I'm now unable to connect to my camera.

CAM: Arducam OV9782 USB Camera: Ardu: usb-0000:01:00.0-1.3: uvcvideo, Characteristics=25
CAM: 640x480 [JPEG, 30.000fps]
Unhandled exception. System.ArgumentException: FlashCap: Couldn't map video buffer: Code=22, DevicePath=/dev/video0
   at FlashCap.Devices.V4L2Device.<>c__DisplayClass14_0.<OnInitializeAsync>b__0() in /_/FlashCap.Core/Devices/V4L2Device.cs:line 144
   at System.Threading.Tasks.Task.InnerInvoke()
   at System.Threading.Tasks.Task.<>c.<.cctor>b__272_0(Object obj)
   at System.Threading.ExecutionContext.RunFromThreadPoolDispatchLoop(Thread threadPoolThread, ExecutionContext executionContext, ContextCallback callback, Object state)
--- End of stack trace from previous location ---
   at System.Threading.ExecutionContext.RunFromThreadPoolDispatchLoop(Thread threadPoolThread, ExecutionContext executionContext, ContextCallback callback, Object state)
   at System.Threading.Tasks.Task.ExecuteWithThreadLocal(Task& currentTaskSlot, Thread threadPoolThread)
--- End of stack trace from previous location ---
   at FlashCap.CaptureDeviceDescriptor.InternalOnOpenWithFrameProcessorAsync(CaptureDevice preConstructedDevice, VideoCharacteristics characteristics, Boolean transcodeIfYUV, FrameProcessor frameProcessor, CancellationToken ct) in /_/FlashCap.Core/CaptureDeviceDescriptor.cs:line 85
   at Core.WebcamAcquisition.SetupAndStart(CancellationToken token) in D:\Source\Core\Core\WebcamAcquisition.cs:line 51
   at Core.Program.Main(String[] args) in D:\Source\Core\Core\Program.cs:line 251
   at Core.Program.<Main>(String[] args)
Aborted

I have looked at the source of FlashCap, but that doesn't bring me closer to understanding what error code 22 is about.

Source of the simple program that throws the above exception:

            var devices = new CaptureDevices();
            var descriptors = devices.EnumerateDescriptors().
                Where(d => d.Characteristics.Length >= 1).             // One or more valid video characteristics.
                ToArray();

            if (descriptors.ElementAtOrDefault(0) is { } descriptor0)
            {
                var characteristics = new VideoCharacteristics(PixelFormats.JPEG, 640, 480, 30);

                // Show status.
                m_logger.Post(new LogMessage("CAM: " + descriptor0.ToString(), LogMessage.LogLevel.Information));
                m_logger.Post(new LogMessage("CAM: " + characteristics.ToString(), LogMessage.LogLevel.Information));

                m_captureDevice = await descriptor0.OpenAsync(
                    characteristics,
                    OnPixelBufferArrivedAsync);

                // Start capturing.
                await m_captureDevice.StartAsync(token);
            }
            else
            {
                m_logger.Post(new LogMessage("CAM: Device Not found", LogMessage.LogLevel.Error));
            }

I tried different cameras, but that doesn't help. The cameras work fine in other programs on the Pi with the new OS.

Any suggestions?

Need support with Avalonia Sample

Hello there,

I am currently trying to get FlashCap running in an Avalonia 11 application using the MVVM Community Toolkit.
I am using the provided sample to get something going, but I am somewhat confused about what to do with the sample methods in the ViewModel, such as OnDeviceListChangedAsync, OnCharacteristicsChangedAsync, and OnPixelBufferArrivedAsync.
Are these methods supposed to be used as commands for the ComboBoxes?
From the sample code, it is not clear to me where these methods get called.

Could someone explain a little of what is going on in that sample?

On Linux (at least), can't close and then reopen the camera

An exception of type 'System.ArgumentException' occurred in System.Private.CoreLib.dll but was not handled in user code: 'FlashCap: Couldn't set video format [3]: DevicePath=/dev/video0'
at FlashCap.Devices.V4L2Device..ctor(String devicePath, VideoCharacteristics characteristics, Boolean transcodeIfYUV, FrameProcessor frameProcessor)
at FlashCap.Devices.V4L2DeviceDescriptor.OpenWithFrameProcessorAsync(VideoCharacteristics characteristics, Boolean transcodeIfYUV, FrameProcessor frameProcessor)
at FlashCap.CaptureDeviceDescriptorExtension.OpenAsync(CaptureDeviceDescriptor descriptor, VideoCharacteristics characteristics, PixelBufferArrivedDelegate pixelBufferArrived)
at MainForm.d__18.MoveNext() in /home/mike/Desktop/TimelapseNow/Timelapse.Desktop/Program.cs:line 294

It works otherwise, as far as I can see.

VideoCharacteristics not detected

Hello everyone!👋👋👋
I used the example from the project description, but the video characteristics are not detected.

My software and hardware information

(screenshots)

Additional Context

using FlashCap;

// Capture device enumeration:
var devices = new CaptureDevices();
var descriptor0 = devices.EnumerateDescriptors().ElementAt(0);

byte[] imageData = await descriptor0.TakeOneShotAsync(
    descriptor0.Characteristics.FirstOrDefault());

// Save to file
await File.WriteAllBytesAsync("oneshot", imageData);

(screenshots)

Avalonia 11: MVVM Community Toolkit: ObservableCollection<CaptureDeviceDescriptor> is null

Hello,

I am having a little trouble using FlashCap with Avalonia 11 and the MVVM Community Toolkit.

I mostly used the Avalonia code sample to get FlashCap up and running, but ran into a problem.

The original sample code looks like this:
public Command? Opened { get; }
public SKBitmap? Image { get; private set; }
public bool IsEnbaled { get; private set; }

public ObservableCollection<CaptureDeviceDescriptor?> DeviceList { get; } = new();
public CaptureDeviceDescriptor? Device { get; set; }

public ObservableCollection<VideoCharacteristics> CharacteristicsList { get; } = new();
public VideoCharacteristics? Characteristics { get; set; }

public string? Statistics1 { get; private set; }
public string? Statistics2 { get; private set; }
public string? Statistics3 { get; private set; }

public MainWindowViewModel()
{
    // Window shown:
    this.Opened = Command.Factory.CreateSync(() =>
    {
        ////////////////////////////////////////////////
        // Initialize and start capture device

        // Enumerate capture devices:
        var devices = new CaptureDevices();

        // Store device list into the combo box.
        this.DeviceList.Clear();
        this.DeviceList.Add(null);

        foreach (var descriptor in devices.EnumerateDescriptors().
            // You could filter by device type and characteristics.
            //Where(d => d.DeviceType == DeviceTypes.DirectShow).  // Only DirectShow device.
            Where(d => d.Characteristics.Length >= 1))             // One or more valid video characteristics.
        {
            this.DeviceList.Add(descriptor);
        }

        this.IsEnbaled = true;
    });
}


The code I am currently using looks like this:
private long frameCount;
private ImageCaptureWindow? _currentWindow;
private CaptureDevice? _captureDevice;

[ObservableProperty] private string? _statistics1;
[ObservableProperty] private string? _statistics2;
[ObservableProperty] private string? _statistics3;

[ObservableProperty] private bool? _isEnabled;
[ObservableProperty] private bool? _windowIsOpen;

[ObservableProperty] [NotifyPropertyChangedFor(nameof(Device))] private ObservableCollection<CaptureDeviceDescriptor>? _devices;
[ObservableProperty] [NotifyPropertyChangedFor(nameof(Characteristic))] private ObservableCollection<VideoCharacteristics>? _characteristics;

[ObservableProperty] private CaptureDeviceDescriptor? _device;
[ObservableProperty] private VideoCharacteristics? _characteristic;
[ObservableProperty] private SKBitmap? _image;

public ImageCaptureWindowViewModel()
{
    // Enumerate capture devices:
    var devices = new CaptureDevices();

    // Store device list into combobox:
    Devices?.Clear();
    Devices = null;

    // Get device descriptor
    foreach (var descriptor in devices.EnumerateDescriptors().
                 Where(d => d.DeviceType == DeviceTypes.DirectShow))
    {
        Devices?.Add(descriptor);
    }

    IsEnabled = true;
}


My particular problem is that

Devices?.Add(descriptor);

does not add the descriptor to

[ObservableProperty] [NotifyPropertyChangedFor(nameof(Device))] private ObservableCollection<CaptureDeviceDescriptor>? _devices;

The descriptor is NOT null, but it is not being added to the collection.

It is not obvious to me why that is the case. Does anyone have tips on what is causing this?

Just in case someone does not know, the MVVM Community Toolkit is an MVVM framework similar to Epoxy.
It auto-generates the public property 'Devices' from the private observable field '_devices'.

Any help would be appreciated.
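For what it's worth, the symptom matches plain null-conditional semantics: the quoted constructor sets Devices to null before the loop, so every Devices?.Add(...) call is silently skipped. A minimal repro with a placeholder element type (string stands in for CaptureDeviceDescriptor):

```csharp
using System.Collections.ObjectModel;

// Null-conditional member access on a null receiver is a silent no-op,
// which matches the symptom: no exception, but nothing is added.
ObservableCollection<string>? list = null;
list?.Add("descriptor");   // does nothing, throws nothing

// With the collection instantiated, Add takes effect:
list = new ObservableCollection<string>();
list?.Add("descriptor");   // list.Count is now 1
```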

Added ShowPropertyPage method to DirectShowDevice class

From PR #110

We have had several suggestions and attempted changes in the past for showing the DirectShow property page:
#14, #42, #50, #64

For my part, I had hoped to provide a programmable (key-value-like) property interface and make it multi-platform compatible. However, since that fix has stalled and I have not been able to take the time to help with this contribution myself, it has effectively been abandoned.

So, once I accept this modification, i.e., making the property page viewable only in the DirectShow environment, I will later consider providing a programmable configuration interface as a whole.

How to capture one image

I notice you removed the captureone function. Could you please give a sample of how to capture only one image? Thanks!
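For reference, the TakeOneShotAsync extension, which appears in other issues on this page, covers the single-image case; a minimal sketch (requires the FlashCap package and a connected camera, so treat this as illustrative):

```csharp
using System.IO;
using System.Linq;
using FlashCap;

var devices = new CaptureDevices();
var descriptor0 = devices.EnumerateDescriptors().First();

// Open the device, capture exactly one frame, and close it again:
byte[] imageData = await descriptor0.TakeOneShotAsync(
    descriptor0.Characteristics.First());

await File.WriteAllBytesAsync("oneshot.bmp", imageData);
```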

Reversed logic when enumerating Linux device capabilities

I've been trying to diagnose why I cannot get FlashCap to correctly enumerate a built-in camera on a Raspberry Pi (model 3 B+).

During my investigation I think I've discovered a flaw in the logic for enumerating the device capabilities - specifically related to the enumeration of the FramesPerSecond.

As commented in the code, while the V4L2 API reports the value as a time interval, FlashCap inverts this value to a FramesPerSecond value.
The minimum and maximum intervals are converted to their FPS values and used to filter a list of standardized frame rates.

However, the problem is that although the interval values have been inverted, the converted values are still treated as a minimum and a maximum when filtering the standardized rates, whereas their meanings should also have been swapped.

For example, if the interval values reported by the V4L2 API are:

min: 10/1
max: 30/1

these would be converted to FPS values of

min: 1/10
max: 1/30

1/10 is a larger value than 1/30 so the minimum and maximum meanings should be reversed:

min: 1/30
max: 1/10

Because the minimum value is larger than the maximum value, the fps >= min && fps <= max filtering constraint can never be true and no FramesPerSecond values are returned.
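The reasoning above can be checked in isolation. This sketch (hypothetical helper, not FlashCap's actual code) converts an interval range to an FPS range and swaps the bounds in the process:

```csharp
using System;

// V4L2 reports frame *intervals* (seconds per frame) as rationals.
// Converting to frames per second inverts each fraction, which also
// swaps which bound is the minimum and which is the maximum.
static (double MinFps, double MaxFps) IntervalRangeToFpsRange(
    (int Num, int Den) minInterval, (int Num, int Den) maxInterval)
{
    // Interval n/d seconds-per-frame => d/n frames-per-second.
    var fpsFromMinInterval = (double)minInterval.Den / minInterval.Num; // shortest interval = fastest rate
    var fpsFromMaxInterval = (double)maxInterval.Den / maxInterval.Num; // longest interval = slowest rate
    return (Math.Min(fpsFromMinInterval, fpsFromMaxInterval),
            Math.Max(fpsFromMinInterval, fpsFromMaxInterval));
}
```

For instance, an interval range of min 1/30 s to max 1/10 s yields an FPS range of min 10 to max 30; filtering standardized rates with the unswapped bounds would reject everything.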

Elgato CamLink 4K DirectShow Pixel Format

Before I describe the issue, I want to say that I very much appreciate your effort on this project. It's been extremely useful, and it's so helpful that it's cross-platform.

Now to the issue:

I see that the Elgato CamLink 4K is on the list of verified devices; however, it does not seem to be working for me.

I'm developing on Win11 with .NET 8.0.

The device enumerates as a DirectShow device with one characteristic:
3840x2160 [Unknown, 29.970fps]

It seems FlashCap doesn't support the pixel format reported by the device.
Value of RawPixelFormat:
3231564e-0000-0010-8000-00aa00389b71

Looking at https://gix.github.io/media-types, it seems this format is a YUV variant called NV12.
(side note: I found this sample code for converting NV12 to RGB: https://paulbourke.net/dataformats/nv12)

I'm curious, when the CamLink 4K was tested before, what pixel format did it report?
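Incidentally, DirectShow video subtype GUIDs embed the FOURCC in Data1 (the first four bytes of the GUID, little-endian), so the reported RawPixelFormat can be decoded with a generic helper like this (not FlashCap API):

```csharp
using System;
using System.Text;

// Decode the FOURCC stored in the first 32 bits of a DirectShow
// media subtype GUID. Guid.ToByteArray() emits Data1 little-endian,
// which is exactly the FOURCC byte order.
static string FourCCFromSubtype(Guid subtype)
{
    var bytes = subtype.ToByteArray();
    return Encoding.ASCII.GetString(bytes, 0, 4);
}

// 3231564e-0000-0010-8000-00aa00389b71 decodes to "NV12".
```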

FlashCap.Avalonia: video does not play while dragging the window

When the mouse is held down on the title bar, or the window is dragged to move it, the video does not play.

The log shows that new image data is still being delivered to the viewmodel:

        // Update a bitmap.
        Dispatcher.UIThread.Post(() =>
        {
            this.Image = bitmap;
            // Update statistics.
            var realFps = countFrames / timestamp.TotalSeconds;
            var fpsByIndex = frameIndex / timestamp.TotalSeconds;
            this.Statistics1 = $"Frame={countFrames}/{frameIndex}";
            this.Statistics2 = $"FPS={realFps:F3}/{fpsByIndex:F3}";
            this.Statistics3 = $"SKBitmap={bitmap.Width}x{bitmap.Height} [{bitmap.ColorType}]";
            Debug.WriteLine("updated a bitmap.");
        }, DispatcherPriority.MaxValue);

So is this an issue with SkiaImageView.Avalonia, or with Avalonia itself?

With Avalonia.GIF, the animation plays normally while the mouse is held down on the title bar or the window is being dragged.

Question about TranscodeFromUYVY method

The method TranscodeFromUYVY in BitmapTranscoder.cs:22 contains a YUV-to-RGB conversion.
Since everything in the method has been converted to integer arithmetic, it is not obvious which color-space conversion is in use. There are two main standards: BT.709 (usually used for HD formats) and BT.601 (usually used for SD formats).
Can anybody tell me which standard is used here? It would be nice to provide a parameter to the method, or separate methods for the various color spaces, especially in view of 4K/UHD, which comes with yet another standard (BT.2020).

Thank you!
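For comparison, a typical fixed-point BT.601 full-range conversion looks like the sketch below; matching the constants against those in BitmapTranscoder.cs should reveal which standard is in use (BT.709 uses different coefficients, e.g. roughly 1.5748 instead of 1.402 for the Cr contribution to red). This is a generic reference implementation, not FlashCap's code:

```csharp
using System;

// BT.601 full-range YCbCr -> RGB using 8-bit fixed point (x256):
//   R = Y + 1.402 * (Cr - 128)
//   G = Y - 0.344 * (Cb - 128) - 0.714 * (Cr - 128)
//   B = Y + 1.772 * (Cb - 128)
static (byte R, byte G, byte B) Bt601ToRgb(byte y, byte u, byte v)
{
    int c = y, d = u - 128, e = v - 128;
    int r = c + ((359 * e) >> 8);           // 1.402 * 256 ~ 359
    int g = c - ((88 * d + 183 * e) >> 8);  // 0.344 * 256 ~ 88, 0.714 * 256 ~ 183
    int b = c + ((454 * d) >> 8);           // 1.772 * 256 ~ 454
    static byte Clamp(int x) => (byte)Math.Clamp(x, 0, 255);
    return (Clamp(r), Clamp(g), Clamp(b));
}
```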

Support for Blackmagic WDM Device

Hi all,

first of all, great library! I experimented a little and ran into an issue with the endless variety of pixel formats out there. I am testing with a Blackmagic SDI input card; its driver includes a DirectShow filter (Blackmagic WDM Capture). It did not work out of the box, so I checked with the debugger and added a FOURCC code:
HDYC = 0x43594448
which is basically PixelFormats.UYVY. Adding a few more switch cases around the code made it work for me.
I forked the repo and pushed my mods here:
https://github.com/flat-eric147/FlashCap
It would be great if this little patch could make it into the main repo!
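As a quick sanity check on the constant above: a FOURCC packs its four ASCII characters in little-endian order, which a generic helper (not part of FlashCap) makes easy to verify:

```csharp
// Pack a four-character code into its little-endian 32-bit value.
static uint FourCC(string code) =>
    (uint)(code[0] | code[1] << 8 | code[2] << 16 | code[3] << 24);

// "HDYC" packs to 0x43594448, matching the value in the patch;
// the same helper gives 0x59565955 for "UYVY".
```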
