openschc / openschc
SCHC Implementation (for Python): see doc at
Home Page: https://openschc.github.io/openschc
License: MIT License
If the FCN size is > 1, the No-ACK mode must number all the frames.
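A minimal sketch of the numbering this issue asks for (an assumption based on the windowed-mode convention of RFC 8724, where the FCN counts down and the final fragment carries the all-1 value; the function name is hypothetical):

```python
def noack_fcn_sequence(num_fragments: int, fcn_size: int) -> list:
    """FCN values for No-ACK mode when fcn_size > 1 (proposed numbering).

    Non-final fragments count down from 2**fcn_size - 2, wrapping around;
    the final fragment carries the all-1 value (2**fcn_size - 1).
    """
    all_ones = (1 << fcn_size) - 1
    span = all_ones  # distinct non-final FCN values: 0 .. all_ones - 1
    fcns = []
    for i in range(num_fragments - 1):
        fcns.append((all_ones - 1 - i) % span)
    fcns.append(all_ones)  # final fragment: all-1
    return fcns
```

For example, with a 3-bit FCN, nine fragments would be numbered 6, 5, 4, 3, 2, 1, 0, 6 and then 7 (all-1) for the last one.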
There is a bug in CoAP unparse, especially when UP and DW fields are present. It appears that the CoAP reconstruction was driven by the rule and not by the header fields. ltn22/openschc contains, in the connector branch, a new version of the unparse. What is still missing is sorting the option fields by their number and position when the rule does not respect that order.
Add an option in parse to indicate the CoAP port number, defaulting to 5683.
Verify the code returned by schc_send. It should be None when no error occurs, and False if no rules are found.
Hi,
I just started to test examples/tcp and I got some errors. Could you please guide me?
After running the server on terminal 1 with the command "python3 ClientServerSimul.py --role server --compression false --rule ../configs/rule1.json", I got the following:
socket binded to 127.0.0.1 ip and port 12345
socket is listening
Connexion de 127.0.0.1 34362
Then I ran the client on another terminal: "python3 ClientServerSimul.py --role client --compression false --rule ../configs/rule1.json --time 20 --payload payload/testfile_small.txt"
On both terminals I then got the same error:
--------------------------- Iteration 1 --------------------------
---------- Client: Ip = 127.0.0.1 Port = 34362 ---------------
Traceback (most recent call last):
File "ClientServerSimul.py", line 100, in
schcConfig.start()
File "ClientServerSimul.py", line 70, in start
self.server.server()
File "/home/samira/openschc/examples/tcp/ServerConnection.py", line 48, in server
self.newThread = ClientThread.ClientThread(ipClient, portClient, self.clientSocket, self.configuration)
File "/home/samira/openschc/examples/tcp/ClientThread.py", line 28, in init
self.client_config()
File "/home/samira/openschc/examples/tcp/ClientThread.py", line 98, in client_config
self.clientConfigInServer.configSim()
File "/home/samira/openschc/examples/tcp/SchcConfig.py", line 68, in configSim
self.node0 = self.make_node(self.sim, self.rule_manager, self.devaddr)
File "/home/samira/openschc/examples/tcp/SchcConfig.py", line 85, in make_node
node = net_sim_core.SimulSCHCNode(sim, extra_config)
File "../../src/net_sim_core.py", line 89, in init
self.config, self, self.layer2, self.layer3, role, unique_peer)
File "../../src/protocol.py", line 124, in init
assert role in ["device", "core-server"]
AssertionError
Make the test fail when the compression rule is not found.
We need to have all tests written using pytest so they can be run by GitHub Actions.
I see two locations for tests:
- the test/ directory. The README suggests one needs to run the scripts directly, but test_frag.py suggests pytest is used.
- test_*.py scripts at the root, which I believe should be run directly.
My approach would be to migrate all tests into the tests/ directory, using the pytest approach.
I can do the heavy-lifting, but @MarinoMtz, please indicate which tests we expect to run.
The rule of the sender is:
{'RuleID': 1, 'RuleIDLength': 3, 'Fragmentation': {'FRDirection': 'DW', 'FRMode': 'ackOnError', 'FRModeProfile': {'FCNSize': 3, 'dtagSize': 2, 'MICALgorithm': 'crc32', 'WSize': 7, 'ackBehavior': 'afterAll1', 'tileSize': 17, 'maxRetry': 4, 'timeout': 600, 'L2WordSize': 8, 'lastTileInAll1': False, 'windowSize': 7}}}
Please look at the content of the tiles. The last two tiles the receiver received are different from what the sender sent, so a MIC error happens at the receiver side.
It looks like the order is reversed, and even accounting for that, the content is still different.
Any idea?
I am going to make a kit to reproduce this error.
{'w-num': 0, 't-num': 6, 'tile': b'\x60\x05\x80'/17, 'sent': True
{'w-num': 0, 't-num': 5, 'tile': b'\x73\x64\x00'/17, 'sent': True
{'w-num': 0, 't-num': 4, 'tile': b'\x00\x40\x80'/17, 'sent': True
{'w-num': 0, 't-num': 3, 'tile': b'\xd2\x01\x00'/17, 'sent': True
{'w-num': 0, 't-num': 2, 'tile': b'\x00\x10\x00'/17, 'sent': True
{'w-num': 0, 't-num': 1, 'tile': b'\x84\x18\x00'/17, 'sent': True
{'w-num': 0, 't-num': 0, 'tile': b'\x37\x04\x00'/17, 'sent': True
{'w-num': 1, 't-num': 6, 'tile': b'\x01\x00\x00'/17, 'sent': True
{'w-num': 1, 't-num': 5, 'tile': b'\x00\x00\x00'/17, 'sent': True
{'w-num': 1, 't-num': 4, 'tile': b'\x00\x00\x00'/17, 'sent': True
{'w-num': 1, 't-num': 3, 'tile': b'\x00\x01\x80'/17, 'sent': True
{'w-num': 1, 't-num': 2, 'tile': b'\xc9\x20\x00'/17, 'sent': True
{'w-num': 1, 't-num': 1, 'tile': b'\x46\x80\x00'/17, 'sent': True
{'w-num': 1, 't-num': 0, 'tile': b'\x08\x00\x80'/17, 'sent': True
{'w-num': 2, 't-num': 6, 'tile': b'\x02\x06\x00'/17, 'sent': True
{'w-num': 2, 't-num': 5, 'tile': b'\x00\x00\x00'/17, 'sent': True
{'w-num': 2, 't-num': 4, 'tile': b'\x00\x00\x00'/17, 'sent': True
{'w-num': 2, 't-num': 3, 'tile': b'\x00\x01\x00'/17, 'sent': True
{'w-num': 2, 't-num': 2, 'tile': b'\xc8\xd2\x00'/17, 'sent': True
{'w-num': 2, 't-num': 1, 'tile': b'\x00\x07\x00'/17, 'sent': True
{'w-num': 2, 't-num': 0, 'tile': b'\x9d\xa4\x80'/17, 'sent': True
{'w-num': 3, 't-num': 6, 'tile': b'\x75\x20\x00'/17, 'sent': True
{'w-num': 3, 't-num': 5, 'tile': b'\x00\x17\x00'/17, 'sent': True
{'w-num': 3, 't-num': 4, 'tile': b'\x10\xf8\x80'/17, 'sent': True
{'w-num': 3, 't-num': 3, 'tile': b'\x56\x00\x00'/17, 'sent': True
{'w-num': 3, 't-num': 2, 'tile': b'\x18\x00'/9, 'sent': True
{'w-num': 3, 't-num': 1, 'tile': b'\x4c\x3c'/14, 'sent': True}]
{'w-num': 0, 't-num': 6, 'nb_tiles': 1, 'raw_tiles': b'\x60\x05\x80'/17}
{'w-num': 0, 't-num': 5, 'nb_tiles': 1, 'raw_tiles': b'\x73\x64\x00'/17}
{'w-num': 0, 't-num': 4, 'nb_tiles': 1, 'raw_tiles': b'\x00\x40\x80'/17}
{'w-num': 0, 't-num': 3, 'nb_tiles': 1, 'raw_tiles': b'\xd2\x01\x00'/17}
{'w-num': 0, 't-num': 2, 'nb_tiles': 1, 'raw_tiles': b'\x00\x10\x00'/17}
{'w-num': 0, 't-num': 1, 'nb_tiles': 1, 'raw_tiles': b'\x84\x18\x00'/17}
{'w-num': 0, 't-num': 0, 'nb_tiles': 1, 'raw_tiles': b'\x37\x04\x00'/17}
{'w-num': 1, 't-num': 6, 'nb_tiles': 1, 'raw_tiles': b'\x01\x00\x00'/17}
{'w-num': 1, 't-num': 5, 'nb_tiles': 1, 'raw_tiles': b'\x00\x00\x00'/17}
{'w-num': 1, 't-num': 4, 'nb_tiles': 1, 'raw_tiles': b'\x00\x00\x00'/17}
{'w-num': 1, 't-num': 3, 'nb_tiles': 1, 'raw_tiles': b'\x00\x01\x80'/17}
{'w-num': 1, 't-num': 2, 'nb_tiles': 1, 'raw_tiles': b'\xc9\x20\x00'/17}
{'w-num': 1, 't-num': 1, 'nb_tiles': 1, 'raw_tiles': b'\x46\x80\x00'/17}
{'w-num': 1, 't-num': 0, 'nb_tiles': 1, 'raw_tiles': b'\x08\x00\x80'/17}
{'w-num': 2, 't-num': 6, 'nb_tiles': 1, 'raw_tiles': b'\x02\x06\x00'/17}
{'w-num': 2, 't-num': 5, 'nb_tiles': 1, 'raw_tiles': b'\x00\x00\x00'/17}
{'w-num': 2, 't-num': 4, 'nb_tiles': 1, 'raw_tiles': b'\x00\x00\x00'/17}
{'w-num': 2, 't-num': 3, 'nb_tiles': 1, 'raw_tiles': b'\x00\x01\x00'/17}
{'w-num': 2, 't-num': 2, 'nb_tiles': 1, 'raw_tiles': b'\xc8\xd2\x00'/17}
{'w-num': 2, 't-num': 1, 'nb_tiles': 1, 'raw_tiles': b'\x00\x07\x00'/17}
{'w-num': 2, 't-num': 0, 'nb_tiles': 1, 'raw_tiles': b'\x9d\xa4\x80'/17}
{'w-num': 3, 't-num': 6, 'nb_tiles': 1, 'raw_tiles': b'\x75\x20\x00'/17}
{'w-num': 3, 't-num': 5, 'nb_tiles': 1, 'raw_tiles': b'\x00\x17\x00'/17}
{'w-num': 3, 't-num': 4, 'nb_tiles': 1, 'raw_tiles': b'\x10\xf8\x80'/17}
{'w-num': 3, 't-num': 3, 'nb_tiles': 1, 'raw_tiles': b'\x56\x00\x00'/17}
{'w-num': 3, 't-num': 1, 'nb_tiles': 1, 'raw_tiles': b'\x4c\x3c\x00'/17}
{'w-num': 3, 't-num': 7, 'nb_tiles': 1, 'raw_tiles': b'\x18\x00'/16}
HEX: 6005b9b200103a4020010420c0dc100200000000000000792404680040040818000000000000b2348000e9da4ba900005c21f156000c130f
bytearray(b'`\x05\xb9\xb2\x00\x10:@ \x01\x04 \xc0\xdc\x10\x02\x00\x00\x00\x00\x00\x00\x00y$\x04h\x00@\x04\x08\x18\x00\x00\x00\x00\x00\x00\xb24\x80\x00\xe9\xdaK\xa9\x00\x00\!\xf1V\x00\x0c\x13\x0f')
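Since the sender's rule selects 'MICAlgorithm': 'crc32', the receiver-side check that fails here can be reproduced roughly as follows (a sketch, not the openschc implementation; padding/L2-word handling is simplified away):

```python
import zlib

def compute_mic(payload: bytes) -> bytes:
    """CRC32 over the reassembled payload, as a 4-byte big-endian MIC."""
    return zlib.crc32(payload).to_bytes(4, "big")

def mic_matches(payload: bytes, received_mic: bytes) -> bool:
    """True when the locally computed MIC equals the one sent in All-1."""
    return compute_mic(payload) == received_mic
```

If the last tiles arrive corrupted or reordered, as in the dumps above, the reassembled payload differs and `mic_matches` returns False.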
This issue is about the changes needed to synchronize the branch master with the branch develop. It stems from a discussion between openschc contributors.
The main question is related to the API changes, which impact "connectors" or "lower layers" (e.g. examples/udp, examples/gateway, examples/simulator, examples/scapy, etc.).
In order to make the connectors work with the branch develop, some changes are necessary.
Some possibilities are listed in the last section, and feedback is welcome.
master
For reference, the architecture and API of openschc in the master branch are "defined" (more or less accurately; the interface with the rule manager is incomplete, and the compression is not called in master).
develop
These are some changes between the two branches:
- there have been changes related to identifier/address management, i.e. SCHCProtocol needs to have both the identifiers (addresses) of the core and of the device (which itself is).
- schc_recv and schc_send now take both addresses/identifiers as arguments.
- there is also a semantics change in schc_recv, which returns a packet (when available) for Layer 3 (the Upper Layer) instead of letting SCHCProtocol call the upper layer with the packet.
- as a result of address/identifier management and proper rule definition, internally every used address is in the format of the identifiers of the rule database, e.g. a string such as "lorawan:70b3d5499126b445".
This impacts: SCHCProtocol.schc_send, SCHCProtocol.schc_recv, UpperLayer.recv_packet, and LowerLayer.send_packet (in design architecture.py; check in actual code).
There are also changes in the scheduler to cancel all events related to one session (src/net_sim_sched.py), e.g. the addition of one method cancel_session.
For schc_recv / schc_send: keep both semantics, from master and from develop, considering the develop ones as more internal than the master ones.
The schc_recv semantics of master (letting SCHCProtocol pass the packet to the UpperLayer) is consistent with the schc_send semantics of letting SCHCProtocol pass the packets to the LowerLayer, and is probably what connector developers expect. Possibilities:
- keep schc_recv / schc_send with the branch master semantics, renamed as schc_recv_from_lower / schc_send_from_upper;
- rename schc_recv / schc_send of develop to _schc_recv / _schc_send;
- or rename SCHCProtocol from branch develop as BaseSCHCProtocol and create a class SCHCProtocol inheriting from BaseSCHCProtocol and implementing the master semantics.
For the knowledge of its own address/id in SCHCProtocol (for the master version of schc_recv/schc_send):
- use LowerLayer.get_address;
- position is also passed to the constructor, so one should consider whether this is consistent.
For the addresses/identifiers:
- if both the develop and master formats are needed, it is possible to use inheritance plus template methods, or object composition with an optional address-conversion object.
For cancel_session, some possibilities:
- move cancel_session inside the SCHC and fragmentation code, not inside the scheduler. Rationale: the semantics of SCHC sessions are unrelated to the semantics of the scheduler;
- rename cancel_session to cancel_event_group (introducing the equivalent concept of an event group).
As the connectors currently stay on master, this is not an urgent issue. Your help in selecting the changes would be greatly welcomed. Thanks!
It is not the case in general. But my MicroPython doesn't have contextlib.
Is it available on your MicroPython? If not, the core SCHC code, which is under the src folder, should use only features supported by MicroPython.
% micropython
MicroPython v1.12-35-g10709846f on 2020-08-03; darwin version
Use Ctrl-D to exit, Ctrl-E for paste mode
>>> import sys
>>> sys.version_info
(3, 4, 0)
>>> import contextlib
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: no module named 'contextlib'
Hi,
I've added the no-compression rule and it works fine: the receiver receives the fragments and does the reassembly, then calls the decompression, which removes the no-compression rule ID.
But then the packet is lost. In the code of frag_recv.py, we have at line 205:
if not self.protocol.config.get("debug-fragment"):
    # XXX
    # XXX in hack105, we have separate databases for C/D and F/R.
    # XXX need to merge them into one. Then, here searching database will
    # XXX be moved into somewhere.
    # XXX
    #rule = self.protocol.rule_manager.FindRuleFromSCHCpacket(schc=schc_packet)
    #dprint("debug: no-ack FindRuleFromSCHCpacket", rule)
    uncompressed = self.protocol.process_decompress(schc_packet, self.sender_L2addr, "UP")
    self.state = 'DONE_NO_ACK'
    print("return uncompress packet")
    dprint(self.state)
    return
Originally the uncompressed variable didn't exist. I don't know how to pass it to the upper layer. Help!!!
@MarinoMtz I have a hard time understanding what stats/ is. Given that there is an __init__.py, I assume it's a package which can be imported. I'm surprised to see log.txt, which appears to be an artifact. I do see that it is imported a bit everywhere. I would recommend to, at least, add a README.md to explain what it is and how it is used.
There is no No-Compression handling in schc_recv in protocol.py. A patch is currently on ltn22/openschc in the connector branch.
Someone wants to show DLMS over SCHC over LoRaWAN.
The DLMS-COSEM implementation is here:
https://github.com/epri-dev/DLMS-COSEM
Could we make it?
Make the code in tests/ more Pythonic.
Why is there a test in protocol.py to test the device? For me, we are independent of the device at this level.
if dev_L2addr == b"\xaa\xbb\xcc\xee":
    if frag_rule[T_FRAG][T_FRAG_PROF][T_FRAG_DTAG] > 0:
        dtag = packet_bbuf.get_bits(frag_rule[T_FRAG][T_FRAG_PROF][T_FRAG_DTAG],
                                    position=frag_rule[T_RULEIDLENGTH])
    else:
        dtag = None
....
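The dtag extraction above reads a bit field placed right after the rule ID; the idea can be sketched independently (a simplified re-implementation, not the openschc BitBuffer):

```python
def get_bits(data: bytes, nbits: int, position: int) -> int:
    """Read nbits starting at bit offset `position` (MSB-first),
    mirroring how a dtag of dtagSize bits is read right after a
    rule ID of RuleIDLength bits."""
    value = 0
    for i in range(position, position + nbits):
        byte = data[i // 8]
        bit = (byte >> (7 - (i % 8))) & 1
        value = (value << 1) | bit
    return value
```

For instance, with a 3-bit rule ID followed by a 2-bit dtag, the dtag is `get_bits(packet, 2, 3)`.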
Hi,
I'm a Scapy maintainer, and have bumped into this project because it was linked in https://github.com/phaethon/kamene/network/dependents
I've noticed that you were using kamene. I'm assuming you were using scapy-python3 before it was renamed. Please note that kamene is NOT an official Scapy version. Quoting https://scapy.net/:
An independent fork of Scapy was created from v2.2.0 in 2015, aimed at supporting only Python3 (scapy3k). The fork diverged, did not follow evolutions and fixes, and has had its own life without contributions back to Scapy. Unfortunately, it has been packaged as python3-scapy in some distributions, and as scapy-python3 on PyPI leading to confusion amongst users. It should not be the case anymore soon. Scapy supports Python3 in addition to Python2 since 2.4.0. Scapy v2.4.0 should be favored as the official Scapy code base. The fork has been renamed as kamene.
There are two issues:
I'd advise switching to scapy, i.e. the original Scapy.
Thanks for your time !
PS: sorry if this feels copy/pasted. It is a bit :(
Because there is no full type value, modify add_rule to store values as bytearray, and add a display function for each field according to its type.
This issue is to help newcomers (or people like myself who did not follow the evolutions of the past 2 years) select the right branch to use. This is only my understanding:
- master is where developments had historically been merged, but there have been no real developments since mid-2020.
- current development happens on develop. @twatteyne is also getting the project into good shape (many tests, PyPI integration, CI, and a lot more).
If one wants to use OpenSCHC, it is suggested that:
- develop should be tried by newcomers. The example examples/scapy should be running (but requires some setup), and many tests (properly automated) are under quick updates to provide some running examples.
- master still has several useful examples, such as examples/udp, examples/simulator, or src/test_newschc.py, but does not integrate compression (only fragmentation, and there are some bugs). Many of them will not run on develop due to changes in the API. Discussion to fix most of this is in #117 and is ongoing work.
due to changes in API. Discussion to fix most of it is in #117 and is ongoing work.To push the documentation generated by sphinx, I would need an existing gh-pages branch on th main repository
@MarinoMtz what is schctest/
? Looks like lots and lots of code, apparently duplicating some of the core OpenSCHC code?!? If this is some dead code, I strongly suggest we remove the folder.
I just cloned the repository and followed the instructions to run the tests:
_________________________________________________________________________________________ test_ruleman_01 __________________________________________________________________________________________
def test_ruleman_01():
    # XXX actually, it is not test code right now.
    RM = RuleManager()
    RM.add_context(context1, rule1, rule2, rule3)
    RM.add_context(context2, rule1)
    print(RM._db)
    #RM.add_rules(context1, [conflictingRule0])
    #RM.add_rules(context1, [bogusRule0])
>   print(RM.find_context_bydevL2addr("AABBCCDD"))
E   AttributeError: 'RuleManager' object has no attribute 'find_context_bydevL2addr'

tests/test_ruleman.py:72: AttributeError
--------------------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------------------
[{'devL2Addr': 'AABBCCDD', 'dstIID': '2001:0db8:85a3::beef', 'comp': {"RuleID": 4, "RuleIDLength": 5, "Compression": []}, 'fragSender': {"RuleID": 4, "RuleIDLength": 3, "profile": {"MICAlgorithm": "RCS_RFC8724", "MICWordSize": 8, "L2WordSize": 8}, "Fragmentation": {"dir": "out", "FRMode": "AckOnError", "FRModeProfile": {"dtagSize": 2, "FCNSize": 3, "ackBehavior": "afterAll1", "WSize": 5, "windowSize": 7, "tileSize": 64}}}, 'fragReceiver': {"RuleID": 7, "RuleIDLength": 3, "Profile": {"MICAlgorithm": "RCS_RFC8724", "MICWordSize": 8, "L2WordSize": 8}, "Fragmentation": {"dir": "in", "FRMode": "NoAck", "FRModeProfile": {"dtagSize": 0, "WSize": 0, "FCNSize": 1, "windowSize": 1}}}}, {'devL2Addr': '', 'dstIID': '', 'comp': {"RuleID": 4, "RuleIDLength": 5, "Compression": []}}]
===================================================================================== short test summary info ======================================================================================
FAILED tests/test_frag.py::test_frag_ack_on_error_no_loss - assert "msg_type_queue -> ['SCHC_ACK_OK']" in "> Queue running event -> 0, callback -> _notify_start\n> 1 [schc] @0 recv-from-l3 None...
FAILED tests/test_frag.py::test_frag_ack_on_error_loss - assert "msg_type_queue -> ['SCHC_ACK_OK']" in "> Queue running event -> 0, callback -> _notify_start\n> 1 [schc] @2 recv-from-l3 None 20...
FAILED tests/test_frag.py::test_frag_no_ack_no_loss - assert 'SUCCESS: MIC matched' in "> Queue running event -> 0, callback -> _notify_start\n> 1 [schc] @4 recv-from-l3 None 2001:0db8:85a3:000...
FAILED tests/test_frag.py::test_frag_no_ack_loss - assert 'ERROR: MIC mismatched' in "> Queue running event -> 0, callback -> _notify_start\n> 1 [schc] @6 recv-from-l3 None 2001:0db8:85a3:0000:...
FAILED tests/test_ruleman.py::test_ruleman_01 - AttributeError: 'RuleManager' object has no attribute 'find_context_bydevL2addr'
=================================================================================== 5 failed, 8 passed in 0.21s ====================================================================================
For many months master has been quiet and all the changes appear in the develop branch, with the following modifications:
I propose, for clarification, to merge develop into master next week.
Hi,
in docs/General/UG_examples I save files related to the examples in the user guide.
-> I changed the name of udp_example to udp_schc, which is sexier.
-> I will soon add the documentation in the UG.
I would like the program to finish when the file is sent. Right now the device is blocked and the core-server does not clear the buffer.
The type of the prefix is an integer from Parser.parse(), but bytes in the rule manager. There is code in the source that internally converts between the types for comparison. I think it's better to keep a single type without conversion.
Is it okay to adopt the "bytes" type for the prefix and IID in Parser.parse()?
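Normalizing on bytes could look like this (a sketch assuming a 64-bit IPv6 prefix; the function names are illustrative, not the actual Parser code):

```python
def prefix_int_to_bytes(prefix: int, length_bits: int = 64) -> bytes:
    """Convert an integer prefix (as produced by a parser) to the
    fixed-width big-endian bytes form the rule manager would store."""
    return prefix.to_bytes(length_bits // 8, "big")

def prefix_bytes_to_int(prefix: bytes) -> int:
    """Inverse conversion, for comparison with legacy integer values."""
    return int.from_bytes(prefix, "big")
```

Converting once at parse time and storing bytes everywhere would remove the ad hoc conversions scattered through the code.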
get_address() in the L2 layer is not needed anymore, correct?
The rule format may have to change to allow several rules for fragmentation and compression for a specific device.
For me the constraints are:
In rulemanager.py, the rule is defined this way:
db = [
    {
        "devL2Addr": ..,
        "dstIID": ..,
        "comp": {
            "ruleID": ..,
            "ruleLength": ..,
            "compression": { ... }
        },
        "fragSender": {
            "ruleID": ..,
            "ruleLength": ..,
            "fragmentation": { ... }
        },
        "fragReceiver": {
            "ruleID": ..,
            "ruleLength": ..,
            "fragmentation": { ... }
        }
    }, ...
]
In my view the structure should be more like:
db = [
    {
        "devL2Addr": ..,
        "devIP": " ",  # mandatory
        "rules": [
            {
                "ruleID": xxx,
                "ruleLength": xxx,
                "compression": [
                    ["fieldID", ..., ...],
                    ["ffffff", ...]
                ]
            },
            {
                "ruleID": xxx,
                "ruleLength": xxx,
                "fragmentation": {
                    "direction": "up",
                    parameters....
                }
            },
            ...
        ]
    },
    {.....}
]
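With such a structure, selecting a rule for a device becomes a two-step lookup: first the device entry, then the first rule of the wanted kind (a sketch of the proposed layout; field names are illustrative):

```python
def find_rule(db, dev_l2addr, kind):
    """kind is "compression" or "fragmentation"; returns the first
    matching rule entry for the device, or None."""
    for device in db:
        if device["devL2Addr"] != dev_l2addr:
            continue
        for rule in device["rules"]:
            if kind in rule:
                return rule
    return None

# Hypothetical database following the proposed structure.
sample_db = [{
    "devL2Addr": "AABBCCDD",
    "devIP": "2001:db8::1",
    "rules": [
        {"ruleID": 1, "ruleLength": 3, "compression": []},
        {"ruleID": 2, "ruleLength": 3, "fragmentation": {"direction": "up"}},
    ],
}]
```

This also makes it natural to hold several fragmentation rules per device (e.g. one per direction), which the current comp/fragSender/fragReceiver triple cannot express.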
The source code and the documentation must be improved to mandate that a "no-compression" RuleID MUST be present in the Rule set.
Define an attribute for this special "compression" rule.
I used an include in the README.rst in examples/udp, because it made it easier to build the Sphinx documentation this way.
GitHub does not process the include, therefore the README is incomplete as you browse the repo.
A workaround is suggested in https://gist.github.com/shaypal5/70044eeda587fae17b180e723496b057
OpenSCHC may use the type stored in the TV to define an operation. The goal is to align OpenSCHC with the YANG data model, where the TV stores only a bytearray. Depending on the field ID, the data can be displayed differently.
Remove files such as test_basic.py that use the add_rule function (old rule manager).
Extraneous files seem to have made their way into the master-hack105 branch.
They are testfile.txt and testfile_large.txt, in the src directory.
When the dust of the current code merge has settled, remember to delete these files.
Add support for GitHub Actions to run all tests all the time.
currently returns the first fragmentation rule, irrespective of the arguments.
Does anyone know what client_server_simulation.txt is?
frag_send.py contains the code below. Why, and what for?
f = open("client_server_simulation.txt", "w+")
Trying to run the instructions in https://github.com/openschc/openschc/blob/master/src/README.md, I get the following error message:
$ python3 test_frag_new.py
<<<...>>>
Traceback (most recent call last):
File "test_frag_new.py", line 131, in
sim.run()
File "/Users/dominique/Desktop/Hackathon108/openschc/src/net_sim_core.py", line 337, in run
self.scheduler.run()
File "/Users/dominique/Desktop/Hackathon108/openschc/src/net_sim_sched.py", line 40, in run
callback(*args)
File "/Users/dominique/Desktop/Hackathon108/openschc/src/net_sim_core.py", line 240, in deliver_packet
count += self.send_packet_on_link(link, packet)
File "/Users/dominique/Desktop/Hackathon108/openschc/src/net_sim_core.py", line 307, in send_packet_on_link
node_to.event_receive(link.from_id, packet)
File "/Users/dominique/Desktop/Hackathon108/openschc/src/net_sim_core.py", line 96, in event_receive
self.layer2.event_receive_packet(sender_id, packet)
File "/Users/dominique/Desktop/Hackathon108/openschc/src/net_sim_layer2.py", line 80, in event_receive_packet
self.protocol.schc_recv(self.devaddr, packet)
File "/Users/dominique/Desktop/Hackathon108/openschc/src/protocol.py", line 243, in schc_recv
dtag_length = frag_rule[T_FRAG][T_FRAG_PROF][T_FRAG_DTAG]
TypeError: 'NoneType' object is not subscriptable
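The TypeError suggests that the fragmentation-rule lookup returned None before schc_recv indexed into it; a defensive sketch (with literal keys standing in for the T_FRAG/T_FRAG_PROF/T_FRAG_DTAG constants, and a hypothetical function name) would check first:

```python
def dtag_length_from_rule(frag_rule):
    """Guard against a missing fragmentation rule before indexing
    into it, instead of letting schc_recv crash with a TypeError."""
    if frag_rule is None:
        raise ValueError("no fragmentation rule found for this packet")
    return frag_rule["Fragmentation"]["FRModeProfile"]["dtagSize"]
```

Raising an explicit error (or dropping the packet with a log message) would make the "no rule matched" case diagnosable instead of a NoneType crash.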
Per #119 (comment)
Hello,
We would like to do some cleaning in the repo. There are a lot of stale branches and currently only scapy is active. So we propose to delete all the inactive branches and keep only scapy as main. @dominique, is the gh-pages branch needed to generate the documentation?
Thanks.
Laurent
There are two copies of the same file:
- gen_rulemanager.py
- src/gen_rulemanager.py
Doing a diff, there are little differences sprinkled around.
Looking at the log:
- src/gen_rulemanager.py seems to be much more up to date, with changes from @ltn22 and @MarinoMtz dating back to 25-May-2022
- gen_rulemanager.py seems to have been copied out of src/ for an IETF hackathon and forgotten
@MarinoMtz, should we remove gen_rulemanager.py?
In the current fragmentation rule, there is no indication of whether the rule must be used uplink or downlink. A new tag must be added to tell the direction.