
Comments (26)

mcclanahoochie commented on April 27, 2024

Hi,

Currently, the Recolor Calculator reads a Color from the options, stores those values into color_ during the calculator's Open() function, and sends the values to the GPU during InitGPU().

To enable changing the color every frame, a couple of things need to happen:

  1. Create an input_stream of Color values (or an array, or your own custom struct). This would look similar to the "IMAGE_GPU" packet stream (except with a different datatype).
  2. Read those values into color_ during ProcessGPU (again, similar to how it's done here and here).
  3. Send the values to the GPU by adding
    glUniform3f(glGetUniformLocation(program_, "recolor"), color_[0], color_[1], color_[2]);
    right after the call to glUseProgram(program_); inside of GlRender().
  4. Also don't forget to register the new stream in GetContract() (see the sketch below).
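
A rough sketch of those pieces (the "RGB_ARRAY" tag and the std::vector<int> packet type here are just placeholders; adapt them to whatever stream you actually define):

  // In GetContract(): register the optional per-frame color stream.
  if (cc->Inputs().HasTag("RGB_ARRAY")) {
    cc->Inputs().Tag("RGB_ARRAY").Set<std::vector<int>>();
  }

  // In Process()/ProcessGPU(): refresh color_ whenever a new packet arrives
  // (assuming color_ is a std::vector<float>; scale if the shader expects [0, 1]).
  if (cc->Inputs().HasTag("RGB_ARRAY") && !cc->Inputs().Tag("RGB_ARRAY").IsEmpty()) {
    const auto& rgb = cc->Inputs().Tag("RGB_ARRAY").Get<std::vector<int>>();
    color_.assign(rgb.begin(), rgb.end());
  }

  // In GlRender(), right after glUseProgram(program_):
  glUniform3f(glGetUniformLocation(program_, "recolor"), color_[0], color_[1], color_[2]);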

After all that is done, you would need another calculator to send in the new color values; the logic for that would be up to you.

These changes to the Recolor Calculator are something we can consider adding in the future, but right now there is no timeline for it.

Hopefully that provides some insight for you.

Cheers,
~Chris


sandipan1 commented on April 27, 2024

Hi,
I am not sure how to follow step 2. I made an input_stream "RGB_ARRAY" and added the following lines:
const Packet& rgb_packet = cc->Inputs().Tag("RGB_ARRAY").Value();
const auto& rgb_buffer = rgb_packet.Get<mediapipe::GpuBuffer>();

I intend to make a slider (SeekBar) for each of the RGB channels in the app. Do I need another calculator that takes the values from the sliders and outputs an array to the Recolor Calculator? If so, how do I send the slider values to the new calculator?
Can you guide me on this?


mcclanahoochie commented on April 27, 2024

Hi,

What data type is the "RGB_ARRAY"? Let's say it is std::vector<int>; then you should be accessing the data of the Packet via
const Packet& rgb_packet = cc->Inputs().Tag("RGB_ARRAY").Value();
const auto& rgb_array = rgb_packet.Get<std::vector<int>>();
(instead of GpuBuffer).

For the sliders, you could have them all in one calculator and output the std::vector<int> in one packet stream... or you could have separate calculators with separate streams "RED_VALUE", "BLUE_VALUE", etc., and then modify the Recolor Calculator to accept each color value stream separately:
const Packet& red_packet = cc->Inputs().Tag("RED_VALUE").Value();
const auto& red_value = red_packet.Get<int>();

....

If you already have some sliders that output values, then I would suggest creating one ColorSliderCalculator that outputs the RGB_ARRAY stream as a vector. Have the color slider calculator accept an InputSidePacket with a pointer to the slider values, then output those values as a packet (an RGB vector<>) from the calculator.
InputSidePackets are packets that get sent once at the beginning of the graph run. They are usually used for initialization values or for connecting data structures across calculators or other parts of the code (in your case, pointers to the RGB sliders).
Or, just move your slider code/logic inside the calculator.
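
For illustration, a rough, untested sketch of what such a ColorSliderCalculator could look like with the side-packet approach (the TICK/SLIDERS/RGB_ARRAY tags and the raw-pointer side packet are placeholders, not code that exists in MediaPipe):

  #include <vector>

  #include "absl/memory/memory.h"
  #include "mediapipe/framework/calculator_framework.h"

  namespace mediapipe {

  // Re-emits the current slider values (shared via a pointer passed in as an
  // input side packet) as an RGB vector every time a tick packet arrives.
  class ColorSliderCalculator : public CalculatorBase {
   public:
    static ::mediapipe::Status GetContract(CalculatorContract* cc) {
      cc->Inputs().Tag("TICK").SetAny();
      cc->InputSidePackets().Tag("SLIDERS").Set<std::vector<int>*>();
      cc->Outputs().Tag("RGB_ARRAY").Set<std::vector<int>>();
      return ::mediapipe::OkStatus();
    }

    ::mediapipe::Status Open(CalculatorContext* cc) override {
      sliders_ = cc->InputSidePackets().Tag("SLIDERS").Get<std::vector<int>*>();
      return ::mediapipe::OkStatus();
    }

    ::mediapipe::Status Process(CalculatorContext* cc) override {
      // Copy the current slider values and hand ownership to the framework.
      auto rgb = absl::make_unique<std::vector<int>>(*sliders_);
      cc->Outputs().Tag("RGB_ARRAY").Add(rgb.release(), cc->InputTimestamp());
      return ::mediapipe::OkStatus();
    }

   private:
    std::vector<int>* sliders_ = nullptr;
  };
  REGISTER_CALCULATOR(ColorSliderCalculator);

  }  // namespace mediapipe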


sandipan1 commented on April 27, 2024

Hi,
I am having trouble sending the packets from the sliders to the MediaPipe graph. I have made the ColorSliderCalculator, which takes in 3 input streams and outputs a vector from the calculator. As for the slider logic, this, this and this hold the slider values. I suppose the next thing I need to do is to create int packets like this. How do I send the packets to the input stream? Also, let me know if any corrections are needed in my code.


mcclanahoochie commented on April 27, 2024

Hi,

Your cc->Outputs().Tag('RGB_OUT).Set<std::vector>(); is missing a ' after OUT, but I would also use double quotes for strings.

You will probably want to use the FrameProcessor to get the packet creator (processor.getPacketCreator()) instead of creating a new one; that way it is associated with the graph.

You can also get access to the graph via the processor; that is where you would add your packets to the input stream. See what onNewFrame is doing at addConsumablePacketToInputStream.
In this case, the input stream is defined at the top of the graph as input_video; you would need to add another 3 streams there to accept your color packets.

To keep things simple, I would recommend sending in your custom packets each frame, regardless of whether the values change, so that calculators don't wait on missing inputs (there is a way to change this behavior later if desired).


sandipan1 commented on April 27, 2024

Hi
I have made the changes in BUILD, activity_main.xml, MainActivity.java, and hair_segmentation_mobile_gpu.pbtxt, and created ColorSliderCalculator. The Bazel build works fine, but the APK crashes once the app is launched. Can you please check for possible bugs?


mcclanahoochie commented on April 27, 2024

Hi,
based on adb logcat:

08-20 21:54:43.104 32312 32333 E AndroidRuntime: RecolorCalculator::GetContract failed to validate:
08-20 21:54:43.104 32312 32333 E AndroidRuntime: For input streams ValidatePacketTypeSet failed:
08-20 21:54:43.104 32312 32333 E AndroidRuntime: Tag "RGB_ARRAY" index 0 was not expected.

I don't see any modification to the Recolor calculator to accept the new rgb array packet stream.

You will need to add this to GetContract in that calculator:

  if (cc->Inputs().HasTag("RGB_ARRAY")) {
    cc->Inputs().Tag("RGB_ARRAY").Set<std::vector<int>>();
  }

Then, in GlRender(), right after the glUseProgram() call, update the uniform:
glUniform3f(glGetUniformLocation(program_, "recolor"), my_colors[0], my_colors[1], my_colors[2]);
with my_colors coming from your new packet stream.

Also, you are really close on the Java side, but putting everything in surfaceChanged is not what you want, as that is called only once!

You want something like this:

  private class RGBHandler implements TextureFrameConsumer {
    @Override
    public void onNewFrame(TextureFrame frame) {
      // read the current slider values and send the color packets here
    }
  }

to receive a callback for each new camera frame.

See HERE where I did this in your MainActivity.java with RGBHandler.

Also, you need to use PacketClonerCalculator in your graph to dupe the slider values each frame, otherwise (like I said previously) the graph will hang waiting for all input streams in the recolor calculator. (PacketClonerCalculator re-emits the most recent packet from its base input streams every time a packet arrives on its tick stream, so the color streams stay in sync with the video stream.)

See HERE where I modified your graph with the PacketCloner.

BTW, I tested that everything I linked to at least compiles (no guarantee it works), but you still need to implement the recolor calculator logic to handle the new slider value streams.

^ Note: the above links expire after a month.

Hope that helps


sandipan1 commented on April 27, 2024

Hi
I made the suggested changes and built the app without errors. This is my latest commit.

However, the camera doesn't open and the app looks like this.


mcclanahoochie commented on April 27, 2024

Likely something is not correct when starting/running the graph.

At a quick glance, I don't see anything too wrong, except the output of ColorSliderCalculator...

Can you use a unique pointer for the output packet of the ColorSliderCalculator
(e.g. absl::make_unique<std::array<int, 3>>)? See the letterbox padding output as a reference. It's just a guess, but it's possible the packet data is destroyed, since you pass a reference (&rgb) to a local variable as the output.
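
For example, a rough sketch of that output pattern (the RGB_ARRAY tag and the r_/g_/b_ members are placeholders; the type registered for the output stream in GetContract() must match):

  // Allocate the output on the heap instead of handing out a reference to a
  // stack variable, then transfer ownership to the framework.
  auto rgb = absl::make_unique<std::array<int, 3>>();
  (*rgb)[0] = r_;
  (*rgb)[1] = g_;
  (*rgb)[2] = b_;
  cc->Outputs().Tag("RGB_ARRAY").Add(rgb.release(), cc->InputTimestamp());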

What does logcat say?


sandipan1 commented on April 27, 2024

Logcat outputs:
adb logcat | grep com.google.mediapipe.apps.hairsegmentationgpu
08-24 23:08:15.684 2672 2672 I Timeline: Timeline: Activity_launch_request time:146732 intent:Intent { act=android.intent.action.MAIN cat=[android.intent.category.LAUNCHER] flg=0x10200000 cmp=com.google.mediapipe.apps.hairsegmentationgpu/.MainActivity bnds=[238,909][439,1241] }
08-24 23:08:17.091 6907 6907 D AccessibilityManager: current package=com.google.mediapipe.apps.hairsegmentationgpu, accessibility manager mIsFinalEnabled=false, mOptimizeEnabled=false, mIsUiAutomationEnabled=false, mIsInterestedPackage=false
08-24 23:08:17.114 6907 6907 I art : at void com.google.mediapipe.apps.hairsegmentationgpu.MainActivity.onCreate(android.os.Bundle) (MainActivity.java:87)
08-24 23:08:17.115 6907 6907 I art : Caused by: java.lang.ClassNotFoundException: Didn't find class "android.view.View$OnUnhandledKeyEventListener" on path: DexPathList[[zip file "/data/app/com.google.mediapipe.apps.hairsegmentationgpu-1/base.apk"],nativeLibraryDirectories=[/data/app/com.google.mediapipe.apps.hairsegmentationgpu-1/lib/arm64, /data/app/com.google.mediapipe.apps.hairsegmentationgpu-1/base.apk!/lib/arm64-v8a, /system/lib64, /vendor/lib64]]
08-24 23:08:17.115 6907 6907 I art : at void com.google.mediapipe.apps.hairsegmentationgpu.MainActivity.onCreate(android.os.Bundle) (MainActivity.java:87)
08-24 23:08:17.349 729 1591 I CameraService: CameraService::connect call (PID -1 "com.google.mediapipe.apps.hairsegmentationgpu", camera ID 1) for HAL version default and Camera API version 1
08-24 23:08:17.467 1555 1576 I ActivityManager: Displayed com.google.mediapipe.apps.hairsegmentationgpu/.MainActivity: +1s733ms
08-24 23:08:17.467 1555 1576 I Timeline: Timeline: Activity_windows_visible id: ActivityRecord{4c14eb8 u0 com.google.mediapipe.apps.hairsegmentationgpu/.MainActivity t49739} time:148515
08-24 23:08:57.722 1555 1570 W ActivityManager: Activity pause timeout for ActivityRecord{4c14eb8 u0 com.google.mediapipe.apps.hairsegmentationgpu/.MainActivity t49739 f}
08-24 23:09:00.648 2672 2672 I Timeline: Timeline: Activity_launch_request time:191696 intent:Intent { act=android.intent.action.MAIN cat=[android.intent.category.LAUNCHER] flg=0x10200000 cmp=com.google.mediapipe.apps.hairsegmentationgpu/.MainActivity bnds=[238,603][439,935] }
08-24 23:09:08.209 1555 1570 W ActivityManager: Activity destroy timeout for ActivityRecord{4c14eb8 u0 com.google.mediapipe.apps.hairsegmentationgpu/.MainActivity t49739 f}
08-24 23:10:01.520 1555 1570 W ActivityManager: Activity pause timeout for ActivityRecord{7a6db9e u0 com.google.mediapipe.apps.hairsegmentationgpu/.MainActivity t49740}
08-24 23:10:11.522 1555 1570 W ActivityManager: Activity stop timeout for ActivityRecord{7a6db9e u0 com.google.mediapipe.apps.hairsegmentationgpu/.MainActivity t49740}
08-24 23:10:15.131 1555 2059 D ProcessManager: remove task: TaskRecord{aecdf46 #49740 A=com.google.mediapipe.apps.hairsegmentationgpu U=0 StackId=1 sz=1}
08-24 23:10:41.070 2672 2672 I Timeline: Timeline: Activity_launch_request time:292117 intent:Intent { act=android.intent.action.MAIN cat=[android.intent.category.LAUNCHER] flg=0x10200000 cmp=com.google.mediapipe.apps.hairsegmentationgpu/.MainActivity bnds=[238,1569][439,1901] }
08-24 23:10:41.361 7682 7682 D AccessibilityManager: current package=com.google.mediapipe.apps.hairsegmentationgpu, accessibility manager mIsFinalEnabled=false, mOptimizeEnabled=false, mIsUiAutomationEnabled=false, mIsInterestedPackage=false
08-24 23:10:41.384 7682 7682 I art : at void com.google.mediapipe.apps.hairsegmentationgpu.MainActivity.onCreate(android.os.Bundle) (MainActivity.java:87)
08-24 23:10:41.385 7682 7682 I art : Caused by: java.lang.ClassNotFoundException: Didn't find class "android.view.View$OnUnhandledKeyEventListener" on path: DexPathList[[zip file "/data/app/com.google.mediapipe.apps.hairsegmentationgpu-1/base.apk"],nativeLibraryDirectories=[/data/app/com.google.mediapipe.apps.hairsegmentationgpu-1/lib/arm64, /data/app/com.google.mediapipe.apps.hairsegmentationgpu-1/base.apk!/lib/arm64-v8a, /system/lib64, /vendor/lib64]]
08-24 23:10:41.385 7682 7682 I art : at void com.google.mediapipe.apps.hairsegmentationgpu.MainActivity.onCreate(android.os.Bundle) (MainActivity.java:87)
08-24 23:10:41.597 729 1591 I CameraService: CameraService::connect call (PID -1 "com.google.mediapipe.apps.hairsegmentationgpu", camera ID 1) for HAL version default and Camera API version 1
08-24 23:10:41.710 1555 1576 I ActivityManager: Displayed com.google.mediapipe.apps.hairsegmentationgpu/.MainActivity: +619ms
08-24 23:10:41.710 1555 1576 I Timeline: Timeline: Activity_windows_visible id: ActivityRecord{c5822d0 u0 com.google.mediapipe.apps.hairsegmentationgpu/.MainActivity t49742} time:292758

I also tried following the letterbox padding example, but the app crashed with those modifications. Here is the gist with the code modifications and log.


mcclanahoochie commented on April 27, 2024

I don't see any error in the logcat posted.

Please try this logcat command instead:

adb logcat -s native:* tflite:* DEBUG:* Adreno:* MainActivity:* AndroidRuntime:* WindowManager:* ExternalTextureConv:* FrameProcessor:*

That ^ is what I use for debugging.

But is this error still relevant?
https://gist.githubusercontent.com/sandipan1/b724405cfab7eee09df4ba4027c55662/raw/846d0d7df22ba0b6e8ce73a23ef7ec089fc898e8/logcat
I think the make_array() function either needs to be declared outside the class, or the ColorSliderCalculator:: class prefix needs to be added to its definition.

I can maybe have a more detailed look later today or tomorrow.


sandipan1 commented on April 27, 2024

Logcat outputs with this commit

08-27 13:10:13.652 7427 7427 I Adreno : QUALCOMM build : 2df12b3, I07da2d9908
08-27 13:10:13.652 7427 7427 I Adreno : Build Date : 10/04/18
08-27 13:10:13.652 7427 7427 I Adreno : OpenGL ES Shader Compiler Version: EV031.25.03.01
08-27 13:10:13.652 7427 7427 I Adreno : Local Branch :
08-27 13:10:13.652 7427 7427 I Adreno : Remote Branch :
08-27 13:10:13.652 7427 7427 I Adreno : Remote Branch :
08-27 13:10:13.652 7427 7427 I Adreno : Reconstruct Branch :
08-27 13:10:13.652 7427 7427 I Adreno : Build Config : S L 6.0.7 AArch64
08-27 13:10:13.656 7427 7427 I Adreno : PFP: 0x005ff087, ME: 0x005ff063
08-27 13:10:14.137 1968 9220 I WindowManager: WIN DEATH: Window{ffbc86a u0 com.android.launcher3/com.android.a1launcher.AndroidOneLauncher}

08-27 12:54:30.661 2931 2950 D ExternalTextureConv: Created output texture: 2 width: 1080 height: 1440
08-27 12:54:30.695 2931 2950 D ExternalTextureConv: Created output texture: 5 width: 1080 height: 1440
08-27 12:54:30.751 2931 2949 I tflite : Initialized TensorFlow Lite runtime.
08-27 12:54:30.752 2931 2949 I tflite : Created TensorFlow Lite delegate for GPU.
08-27 12:54:44.556 1968 2024 W WindowManager: Unable to start animation, surface is null or no children.
08-27 12:54:54.029 1968 1968 W WindowManager: removeWindowToken: Attempted to remove non-existing token: android.os.Binder@e188363

The last 2 lines (Unable to start animation and removeWindowToken) occur when I move one of the sliders.


gitunit commented on April 27, 2024

To make the black screen go away, you need to publish the RGB packets at least once outside of the callback, for example at the end of onCreate().
The ColorSliderCalculator is not really needed; you could publish an array directly or just work with 3 inputs (which is what I actually did).
The actual problem here is retrieving the packets in the recolor_calculator. It seems that those packets never arrive. I checked with the following:

if (cc->Inputs().HasTag("COLOR_R") && !cc->Inputs().Tag("COLOR_R").IsEmpty()) {
  const Packet& r_packet = cc->Inputs().Tag("COLOR_R").Value();
  const auto& r_buffer = r_packet.Get<int>();
}

This way, everything is fine, but the color is not changing.
However, when I remove the check with HasTag and IsEmpty, the app crashes with this error:
2019-08-27 11:28:11.710 19931-19931/? A/DEBUG: pid: 19869, tid: 19887, name: mediapipe_gl_ru >>> com.google.mediapipe.apps.hairsegmentationgpu <<<
2019-08-27 11:28:11.710 19931-19931/? A/DEBUG: signal 6 (SIGABRT), code -6 (SI_TKILL), fault addr --------
This error doesn't occur with this line alone:

const Packet& r_packet = cc->Inputs().Tag("COLOR_R").Value();

In fact, it is the following line that leads to the error:

const auto& r_buffer = r_packet.Get<int>();

I've tried different packet types, but it is always the same error.

This makes me think that somehow the packets never arrive, right?
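
One way to confirm that would be to log from inside RecolorCalculator's Process(); a rough sketch, reusing the COLOR_R tag from above (r_ is a placeholder member):

  if (cc->Inputs().HasTag("COLOR_R")) {
    if (cc->Inputs().Tag("COLOR_R").IsEmpty()) {
      // No slider packet was available for this timestamp.
      LOG(WARNING) << "COLOR_R empty at " << cc->InputTimestamp().DebugString();
    } else {
      r_ = cc->Inputs().Tag("COLOR_R").Get<int>();
      LOG(INFO) << "COLOR_R = " << r_ << " at " << cc->InputTimestamp().DebugString();
    }
  }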


gitunit commented on April 27, 2024

I forgot to mention that you should use System.currentTimeMillis() instead of packet.getTimestamp() inside the RGBHandler, because packet.getTimestamp() will output some invalid negative number.


sandipan1 commented on April 27, 2024

@gitunit I am getting the same error as you when I send the packets at the end of onCreate().


gitunit commented on April 27, 2024

@sandipan1 So the black screen is gone for you as well?
It looks like the packets are empty inside the recolor_calculator; I'm not sure why.


gitunit commented on April 27, 2024

Finally! I figured it out. The timestamp of the RGB packets was way ahead of the image, so there were never packets ready to be synced with the images. You can debug those timestamps inside Graph.java, in the addConsumablePacketToInputStream method.


sandipan1 commented on April 27, 2024

@gitunit In order to sync with the frame, I have added frame.getTimeStamp(). Logcat is this.

However, the black screen returns. I am not sure how to debug inside Graph.java. Can you suggest how to solve this?


gitunit commented on April 27, 2024

Easy: instead of the TextureFrameConsumer, you need to set the OnWillAddFrameListener interface. This is an interface where you can get the timestamp before the image frame is sent.


sandipan1 commented on April 27, 2024

Now the video frame is displaying. However, the segmented hair is inverted vertically and the hair color is only black. No color change occurs when I change the slider positions. I have tried changing processor.getVideoSurfaceOutput().setFlipY(FLIP_FRAMES_VERTICALLY), but it doesn't help.
These are the changes.


mcclanahoochie commented on April 27, 2024

You can try removing the
flip_vertically: true
in the graph (or setting it to false).

If you are in sync with the v0.6 release, you shouldn't need it, and that may be why things are flipped.

...

Can you control the color if you manually set the r/g/b packets to a hardcoded single color when calling addConsumablePacketToInputStream() (instead of the slider values)?


mgyong commented on April 27, 2024

@sandipan1 Any updates from your end? We will close this for now. Please reopen if you have updates.


SwatiModi commented on April 27, 2024

I was trying to do this. As suggested in your latest comment, by setting
flip_vertically: true in the graph, I was able to fix the invert issue.

Also, about the color slider: it does not work, but it does when I hardcode a single color for the r/g/b packets when calling addConsumablePacketToInputStream(). How do I fix this?


pulpoec commented on April 27, 2024

Hello,
I tried to replicate all the steps to run this example, but I have some complications.

What did I do?

* I cloned the MediaPipe repository
* I changed the files that @SwatiModi recommended [here](https://github.com/google/mediapipe/issues/420#issuecomment-694376175):
  * [ColorSliderCalculator.cc](https://github.com/pulpoec/mediapipe_custom/blob/new_branch/mediapipe/calculators/image/ColorSliderCalculator.cc)
  * [recolor_calculator.cc](https://github.com/pulpoec/mediapipe_custom/blob/new_branch/mediapipe/calculators/image/recolor_calculator.cc)
  * [MainActivity.java](https://github.com/pulpoec/mediapipe_custom/blob/new_branch/mediapipe/examples/android/src/java/com/google/mediapipe/apps/hairsegmentationgpu/MainActivity.java)
  * [hair_segmentation_mobile_gpu.pbtxt](https://github.com/pulpoec/mediapipe_custom/blob/new_branch/mediapipe/graphs/hair_segmentation/hair_segmentation_mobile_gpu.pbtxt)
* I built the app without errors
* The app runs, but I get a black screen

On the logcat I have this:

12-07 10:00:25.112 1468 1553 W WindowManager: preserveSurfaceLocked: failed, no surface, w=Window{65604df u0 com.google.mediapipe.apps.hairsegmentationgpu/com.google.mediapipe.apps.hairsegmentationgpu.MainActivity}

And when I scroll the color slider I have this in the logcat:
12-07 10:18:17.610 1468 1616 W WindowManager: Unable to start animation, surface is null or no children.
12-07 10:18:27.093 1468 1468 W WindowManager: removeWindowToken: Attempted to remove non-existing token: android.os.Binder@d1fad5d

The custom mediapipe git is here.

How can I resolve this?

Thanks a lot!


akokhlik commented on April 27, 2024

> Hello,
> I tried to replicate all the steps to run this example, but I have some complications.
>
> What did I do?
>
> * I cloned the MediaPipe repository
> * I changed the files that @SwatiModi recommended [here](https://github.com/google/mediapipe/issues/420#issuecomment-694376175):
>   * [ColorSliderCalculator.cc](https://github.com/pulpoec/mediapipe_custom/blob/new_branch/mediapipe/calculators/image/ColorSliderCalculator.cc)
>   * [recolor_calculator.cc](https://github.com/pulpoec/mediapipe_custom/blob/new_branch/mediapipe/calculators/image/recolor_calculator.cc)
>   * [MainActivity.java](https://github.com/pulpoec/mediapipe_custom/blob/new_branch/mediapipe/examples/android/src/java/com/google/mediapipe/apps/hairsegmentationgpu/MainActivity.java)
>   * [hair_segmentation_mobile_gpu.pbtxt](https://github.com/pulpoec/mediapipe_custom/blob/new_branch/mediapipe/graphs/hair_segmentation/hair_segmentation_mobile_gpu.pbtxt)
> * I built the app without errors
> * The app runs, but I get a black screen
>
> On the logcat I have this:
>
> 12-07 10:00:25.112 1468 1553 W WindowManager: preserveSurfaceLocked: failed, no surface, w=Window{65604df u0 com.google.mediapipe.apps.hairsegmentationgpu/com.google.mediapipe.apps.hairsegmentationgpu.MainActivity}
>
> And when I scroll the color slider I have this in the logcat:
> 12-07 10:18:17.610 1468 1616 W WindowManager: Unable to start animation, surface is null or no children.
> 12-07 10:18:27.093 1468 1468 W WindowManager: removeWindowToken: Attempted to remove non-existing token: android.os.Binder@d1fad5d
>
> The custom mediapipe git is here.
>
> How can I resolve this?
>
> Thanks a lot!

Hi, I have the same black screen problem. How did you solve it?


khmaies5 commented on April 27, 2024

Is there anyone with a working solution for the new version of MediaPipe?
I got these errors after trying what is suggested here; check this comment.

