Question about timestamps and tagging images

  • This post is rather long, but it is easy to follow.


0) The context

1) I observe a problem that I thought should not occur

2) I try to understand the scenario leading to the observation

3) I try to find a workaround, either by a small fix or by totally reconsidering the design; I don't know yet, it depends on 2)


0) First, let me explain the purpose of the whole thing in a few lines:

I have two cameras, each one connected to a frame grabber (VQ8-CXP6D) in a single host PC. The two frame grabbers share an OptoTrigger. I have a custom (but very simple) applet running in the frame grabbers (see attachment). The cameras are synchronized by the same external signal generator (~1000 fps) plugged into Pin0 of the applet. An external TTL can also be raised at any time towards Pin1 of the applet (then redirected as an "EventToHost"). The host PC catches the event and, based on the timestamp of the event, "marks" the matching frame as "tagged". There is no performance problem and no frames are lost; I use a very efficient APC frame grabbing with the SDK.


1) Now, the problem

The idea is that the TTL should tag images that were taken at the exact same time in the real world. But I can see that this is not the case. If I record a sequence (well, one sequence per camera) where a lightbulb is observed, both sequences will (as expected) have a tagged image (which can be artificially considered time T0), but while camera C1 sees the lightbulb lighting up at frame T1, the second camera C2 can see the lightbulb lighting up at frame T2. I have seen T2 = T1 + dT with dT ranging from 0 to 10 frames, while I expected T1 = T2 every time.


2) I tried to understand how it could happen.

Regarding the applet design (see attachment), I assume that the events on Pin1 are always delivered before the images. When I receive the event in my callback, I look at the timestamp (which is, by the way, not in fg_event_info.timestamp[0] but in fg_event_info.timestamp[1]; I don't know why, but that's not the problem), and I push that timestamp into an "events-timestamps" queue (one queue per frame grabber). In the frame-grabbing APC, when a frame is received, I look at the frame timestamp; if the "events-timestamps" queue (of the same FG) is not empty and the frame timestamp is greater than the first event timestamp, I "tag" the image and pop the first event timestamp from its queue. I really thought that was perfect.
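For clarity, here is a stripped-down sketch of that matching logic (plain C++; SDK calls, threading, and locking are omitted, and all names are mine, not from the SDK):

```cpp
#include <cstdint>
#include <deque>

// Simplified model of the tagging scheme described above.
// One instance per frame grabber. In the real application the event
// callback and the APC run on different threads, so the queue would
// need a mutex; omitted here for clarity.
struct FrameTagger {
    std::deque<uint64_t> eventTimestamps;  // filled by the event callback

    // Called from the Pin1 event callback with the event's timestamp.
    void onEvent(uint64_t eventTs) { eventTimestamps.push_back(eventTs); }

    // Called from the frame-grabbing APC for each frame.
    // Returns true if this frame should be marked as "tagged".
    bool onFrame(uint64_t frameTs) {
        if (!eventTimestamps.empty() && frameTs > eventTimestamps.front()) {
            eventTimestamps.pop_front();
            return true;
        }
        return false;
    }
};
```

As you can see, the scheme relies entirely on the event and frame timestamps being comparable, which is exactly where it goes wrong, as explained below.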

First question: are the timestamps reliable? In the documentation, I could finally find the little mention "It is a high-performance query counter and related to Microsoft's query performance counter for Windows®." Does that mean that the timestamp is not generated on the FG but in the host PC? What does that imply in my use case, where I want to catch the event when it occurs (TTL on Pin1), not when it is received by the SDK (which happens, as far as I understand, "some time" later, since the event must also be transmitted to the host PC and prepared to be sent to the registered callbacks)?


3) What kind of workaround can I imagine?

Tagging images is not an easy task. I really want to avoid modifying the image pixels to store information, or even worse (for me), adding data at the end of the image. Even if I wanted to do so, I have not yet been able to produce an efficient VA design for that, because:

- if the event occurs during the acquisition of image I, it seems more reasonable to tag image I+1 and let the software handle that fixed shift

- if we tag image I+1, it means that a signalCounter that could be used to take decisions should not be reset at the frame level, but should rather be stored in a register, the register being reset after being successfully inserted into the next image. Not that easy (for me) to implement in VA

- "FG_IMAGE_TAG" does not seem to be designed for that at all; I am not sure how it could be used


What advice (or solutions) can you give me?

Hello Pierre,


Thanks for this interesting question and the good explanations.


    Let me start with the bad news. You wrote:

"It is a high-performance query counter and related to Microsoft's query performance counter for Windows®." Does that mean that the timestamp is not generated on the FG but in the host PC?

Indeed, the timestamp is not generated by the FG. It is a timestamp generated by the host PC and is not reliable for making any assumptions about the trigger pulses. In fact, the receive times of DMA image transfers and such events can get completely mixed up.


So you need to generate a reliable timestamp inside the frame grabber and use that one. You mentioned several options already. I think the simplest one will be a large counter, e.g. 64 bits counting at every clock cycle (operator PulseCounter @ 125 MHz). Latch the counter value either with the image transfer (PixelToImage) by the camera or with the GPI1 signal (RemovePixel of all counter values except when a trigger is present) and transfer the results either using a second DMA channel or as an image trailer.
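As a rough host-side illustration (a sketch under my own assumptions: the PulseCounter runs at the 125 MHz design clock, i.e. one tick = 8 ns, and the helper name is mine):

```cpp
#include <cstdint>

// Host-side interpretation of a latched 64-bit counter value, assuming
// a PulseCounter incrementing at the 125 MHz design clock (8 ns/tick).
// A 64-bit counter at 125 MHz wraps only after roughly 4,600 years,
// so overflow can be ignored in practice.
constexpr double kTickRateHz = 125e6;

// Convert a tick difference between two latched values to microseconds.
inline double ticksToMicroseconds(uint64_t ticks) {
    return static_cast<double>(ticks) * 1e6 / kTickRateHz;  // 125 ticks = 1 us
}
```

With ~1000 fps, consecutive frames are then ~125,000 ticks apart, which gives you plenty of resolution to associate a trigger pulse with a single frame.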


I hope that'll help you. Feel free to ask more detailed questions.


    Johannes


    Johannes Trein
    Group Leader R&D
    frame grabber

    Basler AG



  • Quote

Indeed, the timestamp is not generated by the FG. It is a timestamp generated by the host PC and is not reliable for making any assumptions about the trigger pulses. In fact, the receive times of DMA image transfers and such events can get completely mixed up.

Right, this is indeed bad news for me because it is highly critical for my applications. I will have to reconsider many things.

At least it is unambiguous: you should really document that in capital letters.


I am not sure I understand your proposal about latching a counter: can you show me an example design?


    Anyway, please note that if I cannot attach a timestamp to an event, then the timestamp information is not that useful. In that case, a single bit for "rising edge detected during image" and another bit for "falling edge detected during image" may be enough.


    And as you saw in my first post, I am not sure about how to handle that in the FG pipeline. It's not easy to explain what I have in mind, but let's try.


- If I store an "event occurred" bit in the image, I will miss it in case of frame loss, whereas "EventToHost" had the great advantage of being delivered anyway. This would be solved if I could attach information to the "EventToHost" channel.


- If I use a register to store "event occurred" and insert that into the first pixel after the "memory buffer", then I think there is a conflict, because a new event could occur during the insertion. That new event should be associated with the next image while I am still marking the current one.


- If I use a register to store "event occurred" in the tail of the image (last pixel), is that better? I really don't like footers; I would largely prefer headers.


- If I use a register to store "event occurred in image N" in the first pixel of image N+1, I can't figure out how to handle the reset properly either.


- An awful solution (for me) would be to add another DmaToPC in order to send bytes instead of an "event" when the tag signal fires. Those bytes could be the FG counter used as a timestamp. But that is a huge refactoring of my acquisition process, and in some applets I already use several DMAs to handle images, so I would have to add a custom option to let the user select whether a "DMA port" is regular or "handles events". Booh.


    Any help is welcome.

  • Hello Pierre,


    Thank you for your detailed explanation of the requirements.


- An awful solution (for me) would be to add another DmaToPC in order to send bytes instead of an "event" when the tag signal fires. ...


    Any help is welcome.

I really want to avoid modifying the image pixels to store information, or even worse (for me), adding data at the end of the image.

From my point of view, adding a header or footer to a DMA transfer is a possible solution, but since you do not like/prefer this approach, I would like to make the following proposal:


    There is a simple and reliable way of getting FG related timestamps:


    You can use the ImageMonitor operator...

    FGevent_Time.png


Above you can see an example of event generation and frame grabber (FG) timestamp transfer.
You register a software callback so that a function gets called when the edge on the trigger input occurs.


A very precise FG timestamp is sampled for each trigger here and waits in a FIFO until ImageMonitor (link to docu) reads it. You can add more data blocks (image number, counters, ...) and use several of these in the design. It is named "Image"Monitor, but here we simply use it as a data "structure".


The event data (FG timestamp) can be read through the register/ImageMonitor interface. If you want some sample C++ code for that, feel free to contact me.
    FGevent_Clocker.png


Above you can see how an FG time can be generated using 64 bits... You can modify the data type to your personal preference.
The current FG time is exposed by GetStatus to the register interface.


    VA DESIGN: FG_TimeStamps_TimeDataForEvent.va


    In case of questions do not hesitate to contact me.


    Best regards,
    Björn

  • Hi,

    I appreciate you trying to deal with my reluctance to change the image stream.

I have never used the "ImageMonitor"; I don't even know how I can read it in the SDK. Where is the documentation on the SDK side?


Your proposal is interesting, but it only generates timestamps for TTL events, and I won't be able to match them with images. But I think it can be improved. Tell me if I am wrong, but could I instead gather in the ImageMonitor:

- the current 64-bit value of the FG counter (pseudo-timestamp), which would be updated at each image

- the current 64-bit image number (which will be, I hope, the same imgNr as in the APC callback (frameindex_t imgNr, struct fg_apc_data* data))

- the 64-bit value of the FG counter when the last TTL event occurred. This value would be updated only on a TTL event


This way, I could:

- read the Monitor FIFO from my APC callback, so that I will always be able to get reliable information and will support frame loss easily

- instead of reading "FG_TIMESTAMP_LONG", use the 64-bit pseudo-timestamp from the monitor

- detect that an event occurred without relying on EventToHost at all


Is it correct?

Would it impact performance a lot?
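To make the idea concrete, here is a rough sketch of the host-side check I have in mind (the record layout and all names are purely hypothetical, not from the SDK):

```cpp
#include <cstdint>

// Hypothetical per-frame record as it could be read from the monitor FIFO.
// Field names and layout are my own invention for illustration.
struct MonitorRecord {
    uint64_t frameTick;      // FG counter value latched at this image
    uint64_t imageNumber;    // 64-bit image number (hopefully == APC imgNr)
    uint64_t lastEventTick;  // FG counter value latched at the last TTL event
};

// A frame is "tagged" if a new TTL event was latched since the previous
// frame, i.e. lastEventTick changed and is not in this frame's future.
inline bool isTagged(const MonitorRecord& prev, const MonitorRecord& cur) {
    return cur.lastEventTick != prev.lastEventTick &&
           cur.lastEventTick <= cur.frameTick;
}
```

If I understand correctly, this would also degrade gracefully on frame loss: a changed lastEventTick still reveals the event even if intermediate frames were dropped.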


Additional questions:

- why do you mention the clock being ns/10 and the output being 10*ns?

- why do you cast the 1x64b into 2x32b before sending to the monitor?

  • Hello Pierre,


    Concerning your questions:


Where is the documentation on the SDK side?

• I can provide some code for this and adapt the data format together with you, based on what we design.

This way, I could:

- read the Monitor FIFO from my APC callback, so that I will always be able to get reliable information and will support frame loss easily

- instead of reading "FG_TIMESTAMP_LONG", use the 64-bit pseudo-timestamp from the monitor

- detect that an event occurred without relying on EventToHost at all

Is it correct?

• You can retrieve the event-related dataset (FG_Time and the things we add ...).
• You can read the "current FG_Time" from the frame grabber when you receive the APC.

Would it impact performance a lot?

• This will not affect the performance of the applet or the software (SW) in practice. The SW needs to read those values, but register reads are really fast: I would estimate a response/read time of ~300 µs.
• The amount of required FPGA resources is minimal.

We should do a VA coaching session of ~1 h, and we will find a nice solution for your request. We can add a lot of additional details:

    • Trigger/Image Count difference within VA
    • Image receive start time
    • DMA transfer start
    • Generate Trigger-Event for rising/falling edge


    Best regards,

  • Dear Pierre,


    It would be simpler...

    Could you post a design for that ?

A very similar approach: CreateBlankImage + an H-Box with FG_Time, but then appending it at the end of the next frame via InsertImage.
Switching between the different cases is required.
When a trigger was received, InsertImage should forward; in all other cases it should not, or should mark the frame correspondingly.


    Best regards,