usdot-jpo-ode / asn1_codec

Module to encode and decode streams of ASN.1 messages, using a Kafka messaging hub to communicate with the data source and data destination in a pub/sub scheme.

License: Apache License 2.0

CMake 0.40% Shell 0.71% C++ 96.73% Batchfile 0.02% C 1.87% Python 0.07% Dockerfile 0.20%

asn1_codec's People

Contributors

aferber, codygarver, dan-du-car, dmccoystephenson, drewjj, hmusavi, iyourshaw, jmcarter9t, mvs5465, mwodahl, paynebrandon, saikrishnabairamoni, tonychen091


asn1_codec's Issues

Support using asn1_codec as a decoder in command-line mode

The Conflict Visualizer uses asn1_codec as a decoder with file input and reads its standard output. For this to work, the ASN1_Codec::filetest member function needs to write decoded output to stdout. We were unable to get this to work by changing the logging configuration, and ideally it should work robustly regardless of the logging config, so we propose simply sending the output of this function to stdout. This should not interfere with normal operation, which reads from Kafka topics and does not use the filetest function.
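One way to implement the proposal (a sketch only, not the project's actual code; `format_decoded` and `write_decoded` are hypothetical names) is to write the decoded payload to an explicit std::ostream that defaults to std::cout, so the behavior is independent of the logging configuration:

```cpp
#include <iostream>
#include <ostream>
#include <string>

// Hypothetical helpers (not the project's actual API): format the decoded
// payload and emit it on a caller-supplied stream, std::cout by default,
// so command-line use is independent of the logging configuration.
std::string format_decoded(const std::string& decoded_xml) {
    return decoded_xml + "\n";   // newline-terminated record for piping
}

void write_decoded(const std::string& decoded_xml, std::ostream& os = std::cout) {
    os << format_decoded(decoded_xml);
    os.flush();                  // visible immediately to piped consumers
}
```

A consumer like the Conflict Visualizer could then capture stdout directly, while log messages stay on their own channel.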

Build issue on dynamic-types branch

We are attempting to use the dynamic-types branch to encode specific parts of an XML message, but are unable to compile the branch.

The Docker build reaches step 24 and then fails with the following errors:

Step 24/26 : RUN cd /asn1_codec && mkdir -p /build && cd /build && cmake /asn1_codec && make
 ---> Running in 55c80b69e1eb
-- The C compiler identification is GNU 4.9.3
-- The CXX compiler identification is GNU 4.9.3
-- Check for working C compiler: /usr/bin/gcc
-- Check for working C compiler: /usr/bin/gcc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/bin/g++
-- Check for working CXX compiler: /usr/bin/g++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Configuring done
-- Generating done
-- Build files have been written to: /build
/usr/local/bin/cmake -H/asn1_codec -B/build --check-build-system CMakeFiles/Makefile.cmake 0
/usr/local/bin/cmake -E cmake_progress_start /build/CMakeFiles /build/CMakeFiles/progress.marks
make -f CMakeFiles/Makefile2 all
make[1]: Entering directory '/build'
make -f CMakeFiles/acm_tests.dir/build.make CMakeFiles/acm_tests.dir/depend
make[2]: Entering directory '/build'
cd /build && /usr/local/bin/cmake -E cmake_depends "Unix Makefiles" /asn1_codec /asn1_codec /build /build /build/CMakeFiles/acm_tests.dir/DependInfo.cmake --color=
Scanning dependencies of target acm_tests
make[2]: Leaving directory '/build'
make -f CMakeFiles/acm_tests.dir/build.make CMakeFiles/acm_tests.dir/build
make[2]: Entering directory '/build'
[  7%] Building CXX object CMakeFiles/acm_tests.dir/src/tests.o
/usr/bin/g++   -D_ASN1_CODEC_TESTS -I/asn1_codec/include -I/asn1_codec/include/catch -I/asn1_codec/include/rapidjson -I/asn1_codec/include/spdlog -I/asn1_codec/asn1c/skeletons -I/asn1_codec/asn1c_combined -I/usr/local/include -I/usr/local/include/librdkafka  -O3 --coverage -DPDU=MessageFrame   -std=gnu++11 -o CMakeFiles/acm_tests.dir/src/tests.o -c /asn1_codec/src/tests.cpp
[ 15%] Building CXX object CMakeFiles/acm_tests.dir/src/acm.o
/usr/bin/g++   -D_ASN1_CODEC_TESTS -I/asn1_codec/include -I/asn1_codec/include/catch -I/asn1_codec/include/rapidjson -I/asn1_codec/include/spdlog -I/asn1_codec/asn1c/skeletons -I/asn1_codec/asn1c_combined -I/usr/local/include -I/usr/local/include/librdkafka  -O3 --coverage -DPDU=MessageFrame   -std=gnu++11 -o CMakeFiles/acm_tests.dir/src/acm.o -c /asn1_codec/src/acm.cpp
/asn1_codec/src/acm.cpp: In constructor 'ASN1_Codec::ASN1_Codec(const string&, const string&)':
/asn1_codec/src/acm.cpp:207:28: error: 'make_asn_name_type_map' was not declared in this scope
     make_asn_name_type_map();
                            ^
/asn1_codec/src/acm.cpp: In member function 'bool ASN1_Codec::process_message(RdKafka::Message*, std::stringstream&)':
/asn1_codec/src/acm.cpp:960:54: error: 'set_decoding_requirements' was not declared in this scope
                 set_decoding_requirements( input_doc );
                                                      ^
/asn1_codec/src/acm.cpp:971:54: error: 'set_encoding_requirements' was not declared in this scope
                 set_encoding_requirements( input_doc );
                                                      ^
/asn1_codec/src/acm.cpp: In member function 'void ASN1_Codec::encode_frame_data(const string&, std::string&)':
/asn1_codec/src/acm.cpp:1385:46: error: invalid conversion from 'uint32_t {aka unsigned int}' to 'asn_TYPE_descriptor_s*' [-fpermissive]
  struct asn_TYPE_descriptor_s* data_struct = curr_op_;
                                              ^
/asn1_codec/src/acm.cpp: At global scope:
/asn1_codec/src/acm.cpp:1446:69: error: no 'bool ASN1_Codec::set_encoding_requirements(pugi::xml_document&)' member function declared in class 'ASN1_Codec'
 bool ASN1_Codec::set_encoding_requirements( pugi::xml_document& doc ) {
                                                                     ^
/asn1_codec/src/acm.cpp:1526:69: error: no 'bool ASN1_Codec::set_decoding_requirements(pugi::xml_document&)' member function declared in class 'ASN1_Codec'
 bool ASN1_Codec::set_decoding_requirements( pugi::xml_document& doc ) {
                                                                     ^
/asn1_codec/src/acm.cpp: In member function 'bool ASN1_Codec::file_test(std::string, std::ostream&, bool)':
/asn1_codec/src/acm.cpp:1627:54: error: 'set_decoding_requirements' was not declared in this scope
                 set_decoding_requirements( input_doc );
                                                      ^
/asn1_codec/src/acm.cpp:1638:54: error: 'set_encoding_requirements' was not declared in this scope
                 set_encoding_requirements( input_doc );
                                                      ^
/asn1_codec/src/acm.cpp: At global scope:
/asn1_codec/src/acm.cpp:1685:41: error: no 'void ASN1_Codec::make_asn_name_type_map()' member function declared in class 'ASN1_Codec'
 void ASN1_Codec::make_asn_name_type_map() {
                                         ^
/asn1_codec/src/acm.cpp: In member function 'bool ASN1_Codec::filetest()':
/asn1_codec/src/acm.cpp:1762:54: error: 'set_decoding_requirements' was not declared in this scope
                 set_decoding_requirements( input_doc );
                                                      ^
/asn1_codec/src/acm.cpp:1773:54: error: 'set_encoding_requirements' was not declared in this scope
                 set_encoding_requirements( input_doc );
                                                      ^
CMakeFiles/acm_tests.dir/build.make:89: recipe for target 'CMakeFiles/acm_tests.dir/src/acm.o' failed
make[2]: Leaving directory '/build'
make[2]: *** [CMakeFiles/acm_tests.dir/src/acm.o] Error 1
make[1]: *** [CMakeFiles/acm_tests.dir/all] Error 2
CMakeFiles/Makefile2:70: recipe for target 'CMakeFiles/acm_tests.dir/all' failed
make[1]: Leaving directory '/build'
make: *** [all] Error 2
Makefile:86: recipe for target 'all' failed
ERROR: Service 'adm' failed to build: The command '/bin/sh -c cd /asn1_codec && mkdir -p /build && cd /build && cmake /asn1_codec && make' returned a non-zero code: 2
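All of the errors are of the "was not declared in this scope" / "no member function declared" variety, which suggests that acm.cpp on this branch defines member functions that the checked-out class declaration does not include (a header/source mismatch). A hypothetical sketch of the declarations the compiler is asking for, with names taken from the error log and stub bodies so it compiles in isolation:

```cpp
// Sketch only: minimal stand-ins illustrating the missing declarations.
// pugi::xml_document is stubbed here; the real type comes from pugixml.
namespace pugi { class xml_document {}; }

class ASN1_Codec {
  public:
    // Declarations matching the definitions acm.cpp contains per the error
    // log; their absence from the class declaration would produce exactly
    // these compile errors. Stub bodies are for illustration only.
    bool set_encoding_requirements(pugi::xml_document&) { return true; }
    bool set_decoding_requirements(pugi::xml_document&) { return true; }
    void make_asn_name_type_map() {}
};
```

If this is the cause, the fix is to add the three declarations to the ASN1_Codec class on the dynamic-types branch (or pull a branch revision where header and source agree).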

First message published to a topic after ADM/AEM startup does not get decoded/encoded

This was reported by WyDOT, and the ODE team created a ticket for it (#224). The first message sent to the codec module does not get encoded/decoded and must be resent. Here is what the log.info file looks like when two messages are sent; only the second one gets read:

[180209 16:47:31.014797] [trace] Waiting for needed consumer topic: topic.Asn1EncoderInput.
[180209 16:47:31.022779] [warning] Metadata did not contain topic: topic.Asn1EncoderInput.
[180209 16:47:32.523523] [trace] Waiting for needed consumer topic: topic.Asn1EncoderInput.
[180209 16:47:32.527458] [warning] Metadata did not contain topic: topic.Asn1EncoderInput.
[180209 16:47:34.027599] [trace] Waiting for needed consumer topic: topic.Asn1EncoderInput.
[180209 16:47:34.031222] [warning] Metadata did not contain topic: topic.Asn1EncoderInput.
[180209 16:47:35.531693] [trace] Waiting for needed consumer topic: topic.Asn1EncoderInput.
[180209 16:47:35.537131] [info] Topic: topic.Asn1EncoderInput found in the kafka metadata.
[180209 16:47:35.537147] [trace] Consumer topic: topic.Asn1EncoderInput is available.
[180209 16:47:35.537276] [info] Consumer: rdkafka#consumer-1 created using topics: topic.Asn1EncoderInput.
[180209 16:47:35.537495] [info] Producer: rdkafka#producer-2 created using topic: topic.Asn1EncoderOutput.
[180209 16:47:38.693718] [trace] process_message(): starting...
[180209 16:47:38.693733] [info] ODE BSM consumer partition end of file, but ASN1_Codec still alive.
[180209 16:47:38.693738] [trace] process_message(): finished...
[180209 16:47:43.693683] [trace] process_message(): starting...
[180209 16:47:43.693736] [info] process_message(): Waiting for more BSMs from the ODE producer.
[180209 16:47:43.693765] [trace] process_message(): finished...
[180209 16:47:47.703694] [trace] process_message(): starting...
[180209 16:47:47.703721] [trace] process_message(): Read message at byte offset: 1 with length 2917
[180209 16:47:47.703745] [trace] process_message(): Message timestamp: create time, type: 1518194867677
[180209 16:47:47.718388] [trace] run(): successful encoding/decoding
[180209 16:47:47.810927] [trace] process_message(): starting...
[180209 16:47:47.810950] [info] ODE BSM consumer partition end of file, but ASN1_Codec still alive.
[180209 16:47:47.810958] [trace] process_message(): finished...

It seems that the consumer does not get created until its topic exists, and the ODE creates topics dynamically as messages are sent through, so the first message arrives before the consumer has subscribed.
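The log above shows the codec polling broker metadata until the topic appears. A generic sketch of that wait-for-topic startup pattern (the function and its injected topic-list source are hypothetical, not librdkafka calls; the real module would fetch the list from broker metadata):

```cpp
#include <algorithm>
#include <functional>
#include <string>
#include <vector>

// Hypothetical illustration of the startup pattern seen in the log:
// repeatedly fetch the broker's topic list until the consumer topic
// appears (or attempts run out). The topic list is injected here so the
// sketch stands alone; the real module reads librdkafka metadata.
bool wait_for_topic(const std::function<std::vector<std::string>()>& fetch_topics,
                    const std::string& topic,
                    int max_attempts) {
    for (int attempt = 0; attempt < max_attempts; ++attempt) {
        std::vector<std::string> topics = fetch_topics();
        if (std::find(topics.begin(), topics.end(), topic) != topics.end()) {
            return true;   // topic found; safe to create the consumer
        }
        // real code would sleep (~1.5 s in the log) before retrying
    }
    return false;          // metadata never contained the topic
}
```

With this pattern, no messages are consumed until the topic exists, which is consistent with a first message being lost if it is what triggers topic creation.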

Codec stops on Kafka timeouts

Summary

WyDOT reported that during a heavy file upload period, the ADM container shut down. We believe that either Zookeeper or Kafka got stuck for approximately 4 minutes, during which it did not respond to producer or consumer requests. This lack of response then caused the ADM to shut down. Since 4 minutes later Kafka/Zookeeper resumed working, it seems that it would be better for the ADM to pause rather than shut down.

We are still working to determine why Kafka went down and what the possible solutions are. Since there were no logs for 4 minutes, either the broker crashed and automatically restarted or there was some sort of network outage. The ODE reported network connectivity errors at the same time, reinforcing the network-outage theory.

Debugging

The log.error file contains many entries with this message:

[180110 21:12:52.425121] [error] run(): Failure of XER encoding: Local: Queue full

This seems to come from the switch statement in this file. There is a timeout condition:

case RdKafka::ERR__TIMED_OUT:
    ilogger->info("{}: Waiting for more BSMs from the ODE producer.", fnname );

But Kafka can throw two types of timeout errors:

RD_KAFKA_RESP_ERR__MSG_TIMED_OUT: Produced message timed out
RD_KAFKA_RESP_ERR__TIMED_OUT: Operation timed out

So I believe that Kafka shut down, the producer threw ERR__MSG_TIMED_OUT after failing to get a response, and that error fell into the "default" case of the switch statement above, which sets "data_available" to false and ends the processing run.
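A sketch of the proposed behavior: treat both timeout codes as "keep waiting" rather than letting ERR__MSG_TIMED_OUT reach the default case. The constants below mirror librdkafka's error-code values but are defined locally so the sketch stands alone; `keep_processing` is a hypothetical name, not the module's actual function:

```cpp
// Local stand-ins mirroring librdkafka's RdKafka::ErrorCode values,
// defined here so the sketch compiles without librdkafka.
enum ErrorCode {
    ERR_NO_ERROR       = 0,
    ERR__MSG_TIMED_OUT = -192,   // produced message timed out
    ERR__TIMED_OUT     = -185,   // operation timed out
    ERR__UNKNOWN       = -1
};

// Decide whether the processing loop should keep running. Handling both
// timeout codes explicitly keeps a temporary broker stall from reaching
// the default case, which currently clears data_available and ends the run.
bool keep_processing(ErrorCode err) {
    switch (err) {
        case ERR_NO_ERROR:
        case ERR__TIMED_OUT:
        case ERR__MSG_TIMED_OUT:   // previously fell through to default
            return true;           // pause and retry instead of shutting down
        default:
            return false;          // unrecoverable errors still stop the run
    }
}
```

With this change, a 4-minute broker outage would show up as repeated timeout retries rather than a container shutdown.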

Logging Update

The asn1_codec module currently mixes logging approaches: it keeps separate files for error-level and info-level logging while also writing manually to std::cerr and std::cout. This should be consolidated into a single logging stream that sends data to the standard console outputs for ease of consumption by Docker/K8s. Additionally, the module should allow the logging level to be overridden via a configuration file setting to prevent logging size issues down the road.
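A minimal sketch of the consolidation idea (hypothetical types; the module vendors spdlog, which supports the same thing with a console sink and a runtime level setting): one console stream, one configurable level threshold:

```cpp
#include <iostream>
#include <ostream>
#include <string>

// Hypothetical single-stream logger illustrating the proposal: everything
// goes to one console stream (which Docker/K8s capture directly), and
// messages below a configurable threshold are dropped to bound log volume.
enum class Level { trace = 0, info = 1, warning = 2, error = 3 };

struct ConsoleLogger {
    Level threshold = Level::info;   // overridable from the config file

    bool should_log(Level lvl) const { return lvl >= threshold; }

    void log(Level lvl, const std::string& msg, std::ostream& os = std::cout) {
        if (should_log(lvl)) os << msg << '\n';
    }
};
```

Raising the threshold to warning in production would suppress the trace/info chatter visible in the logs above while keeping errors on the console.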
