poke1024 / bbz-segment
Code and data for the paper at http://arxiv.org/abs/2004.07317
License: GNU General Public License v2.0
Hey,
Sorry to create another issue, but I figured it was the best way to ask this. Would you consider adding a license to the repo clarifying how the code can and cannot be re-used?
I'm personally a fan of GPL2 (mainly because it requires keeping any modifications open source as well) and MIT.
Thank you!
Hey!
I am trying to load the pre-trained model provided via Dropbox so I can adapt it a bit, but it fails to load. Initially the problem was the custom activation function, which I fixed by passing it as a custom_object to load_model in predict.py; see here for how I went about it.
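For reference, the registration looks roughly like this. This is a sketch with stand-in definitions so it runs without TensorFlow installed; in the real predict.py the entries are tf.nn.swish and the FixedDropout layer class from the repo:

```python
import math

# Sketch: the custom_objects mapping passed to load_model. Stand-ins
# replace tf.nn.swish and the repo's FixedDropout layer so the snippet
# is self-contained.
def swish(x):
    """Stand-in for tf.nn.swish: x * sigmoid(x)."""
    return x / (1.0 + math.exp(-x))

class FixedDropout:
    """Placeholder for the FixedDropout layer class from predict.py."""
    pass

custom_objects = {"swish": swish, "FixedDropout": FixedDropout}
# Real call: tf.keras.models.load_model(path, custom_objects=custom_objects)
```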
But now I am running into a new problem; I get the following error (emphasis on the last line):
Traceback (most recent call last):
File "05_prediction/src/main.py", line 22, in <module>
], models_path=models_path)
File "/Users/user/Documents/spark/CS501-Liberator-Project/bbz-segment/05_prediction/src/predict.py", line 115, in load
loaded[name] = c(name, **kwargs)
File "/Users/user/Documents/spark/CS501-Liberator-Project/bbz-segment/05_prediction/src/predict.py", line 155, in __init__
custom_objects={'swish': tf.nn.swish, 'FixedDropout': FixedDropout})
File "/Users/user/.local/share/virtualenvs/bbz-segment-czho4_QW/lib/python3.7/site-packages/tensorflow_core/python/keras/saving/save.py", line 146, in load_model
return hdf5_format.load_model_from_hdf5(filepath, custom_objects, compile)
File "/Users/user/.local/share/virtualenvs/bbz-segment-czho4_QW/lib/python3.7/site-packages/tensorflow_core/python/keras/saving/hdf5_format.py", line 168, in load_model_from_hdf5
custom_objects=custom_objects)
File "/Users/user/.local/share/virtualenvs/bbz-segment-czho4_QW/lib/python3.7/site-packages/tensorflow_core/python/keras/saving/model_config.py", line 55, in model_from_config
return deserialize(config, custom_objects=custom_objects)
File "/Users/user/.local/share/virtualenvs/bbz-segment-czho4_QW/lib/python3.7/site-packages/tensorflow_core/python/keras/layers/serialization.py", line 106, in deserialize
printable_module_name='layer')
File "/Users/user/.local/share/virtualenvs/bbz-segment-czho4_QW/lib/python3.7/site-packages/tensorflow_core/python/keras/utils/generic_utils.py", line 303, in deserialize_keras_object
list(custom_objects.items())))
File "/Users/user/.local/share/virtualenvs/bbz-segment-czho4_QW/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/network.py", line 937, in from_config
config, custom_objects)
File "/Users/user/.local/share/virtualenvs/bbz-segment-czho4_QW/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/network.py", line 1903, in reconstruct_from_config
process_node(layer, node_data)
File "/Users/user/.local/share/virtualenvs/bbz-segment-czho4_QW/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/network.py", line 1851, in process_node
output_tensors = layer(input_tensors, **kwargs)
File "/Users/user/.local/share/virtualenvs/bbz-segment-czho4_QW/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/base_layer.py", line 773, in __call__
outputs = call_fn(cast_inputs, *args, **kwargs)
File "/Users/user/.local/share/virtualenvs/bbz-segment-czho4_QW/lib/python3.7/site-packages/tensorflow_core/python/keras/layers/core.py", line 846, in call
result = self.function(inputs, **kwargs)
File "/home/sc.uni-leipzig.de/bo140rasi/.local/lib/python3.7/site-packages/keras/utils/multi_gpu_utils.py", line 198, in get_slice
AttributeError: module 'tensorflow.python.keras.backend' has no attribute 'slice'
I can't track down why this is happening. I presume it stems from something being loaded for the pre-trained model in the HDF5 files: it seems to look for a slice function on the Keras backend but cannot find it, and I can't even locate where the call originates. I tried updating base_dir in the run.json file in the model directory models\v3\sep\1\run.json, but this still does not fix the issue.
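One workaround that has helped with this class of error (an assumption, not a confirmed fix for this repo) is to alias a slice implementation onto the Keras backend module before calling load_model, since the frame above shows the saved Lambda layer calling get_slice from keras multi_gpu_utils, which in turn calls backend.slice. In real code the arguments would be `from tensorflow.python.keras import backend as K` and `tf.slice`; stand-ins keep this sketch runnable without TensorFlow:

```python
import types

# Hypothetical workaround: add a `slice` attribute to the backend module
# if it is missing, before load_model deserializes the Lambda layer.
def patch_backend(backend, slice_fn):
    if not hasattr(backend, "slice"):
        backend.slice = slice_fn
    return backend

fake_backend = types.SimpleNamespace()  # stand-in for tensorflow.python.keras.backend
fake_slice = lambda x, begin, size: x[begin:begin + size]  # stand-in for tf.slice
patch_backend(fake_backend, fake_slice)
```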
I am using the following:
Any help would be super appreciated!
Hi,
I have certain issues with package versions, specifically segmentation_models and keras. I would be very grateful if you could share the package requirements.
Thank you!
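In the meantime, the exact versions can be captured from a working environment with pip freeze. A sketch (run inside the project's virtualenv; the package-name filter is an assumption about which packages matter here):

```shell
# Write the pinned versions of the relevant packages to requirements.txt;
# `|| true` keeps the pipeline from failing if a package is absent.
python3 -m pip freeze | grep -iE 'tensorflow|keras|segmentation|opencv' > requirements.txt || true
```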
/data/py/origami/bbz-segment/02_preprocessing$ python main.py
out_bin_path/data/py/origami/00_demo_data/corpus/0000/bin/2436020X_1918-12-10_64_578_006.png
896x1280: 0%| | 0/1 [00:00<?, ?it/s][[ 1 -1 -1 -1]
[ 2 0 -1 -1]
[ 3 1 -1 -1]
[ 4 2 -1 -1]
[ 5 3 -1 -1]
[ 6 4 -1 -1]
[ 7 5 -1 -1]
[ 8 6 -1 -1]
[ 9 7 -1 -1]
[10 8 -1 -1]
[11 9 -1 -1]
[12 10 -1 -1]
[13 11 -1 -1]
[14 12 -1 -1]
[-1 13 -1 -1]]
error on generating data for /data/py/origami/00_demo_data/corpus/0000/ann/2436020X_1918-12-10_64_578_006.psd
896x1280: 0%| | 0/1 [00:03<?, ?it/s]
Traceback (most recent call last):
File "main.py", line 480, in <module>
p.gen()
File "main.py", line 464, in gen
create_training_data((896, 1280), (896, 384))
File "main.py", line 445, in create_training_data
self._gen_train(inputs, codes_name, codes_mapping, full_size, tile_size)
File "main.py", line 405, in _gen_train
converter(gt_ref)
File "main.py", line 259, in __call__
augmentation)
File "main.py", line 293, in _psd
ground_truth = gt_ref.load(self._logger)
File "/data/py/origami/bbz-segment/02_preprocessing/preprocessing/gt.py", line 248, in load
return GroundTruthRef._loader(self, self.annotated_path, self.document_path, logger)
File "/data/py/origami/bbz-segment/02_preprocessing/preprocessing/gt.py", line 231, in __call__
gt.add_labels("regions", self._generate_regions(gt))
File "/data/py/origami/bbz-segment/02_preprocessing/preprocessing/gt.py", line 172, in _generate_regions
regions = annotations.regions()
File "/data/py/origami/bbz-segment/02_preprocessing/preprocessing/labels.py", line 471, in regions
return Regions(self.clabels, segments)
File "/data/py/origami/bbz-segment/02_preprocessing/preprocessing/labels.py", line 819, in __init__
self._tables = morpholizer.table_polygons()
File "/data/py/origami/bbz-segment/02_preprocessing/preprocessing/labels.py", line 1177, in table_polygons
micro_regions, macro_regions = self._table_regions_at_iterations(kernel, (2, 5))
File "/data/py/origami/bbz-segment/02_preprocessing/preprocessing/labels.py", line 1166, in _table_regions_at_iterations
results.append(_regions_to_convex_hull(table_mask).astype(numpy.uint8))
File "/data/py/origami/bbz-segment/02_preprocessing/preprocessing/labels.py", line 486, in _regions_to_convex_hull
polygons = mask_to_polygons(mask, convex_hulls=True)
File "/data/py/origami/bbz-segment/02_preprocessing/preprocessing/utils/__init__.py", line 90, in mask_to_polygons
return mask_to_contours(mask, cls=shapely.geometry.Polygon, **kwargs)
File "/data/py/origami/bbz-segment/02_preprocessing/preprocessing/utils/__init__.py", line 57, in mask_to_contours
hull = cv2.convexHull(c, returnPoints=False)
cv2.error: OpenCV(4.5.3) /tmp/pip-req-build-l1r0y34w/opencv/modules/imgproc/src/convhull.cpp:143: error: (-215:Assertion failed) total >= 0 && (depth == CV_32F || depth == CV_32S) in function 'convexHull'
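The assertion "total >= 0 && (depth == CV_32F || depth == CV_32S)" means cv2.convexHull requires a non-empty contour whose points are int32 or float32. A hypothetical guard before the cv2.convexHull call in mask_to_contours could look like this (cv2 itself is not imported so the sketch runs standalone; the helper name is my own):

```python
import numpy as np

# Cast a contour to a dtype convexHull accepts and skip degenerate input.
def prepare_contour(points):
    c = np.asarray(points)
    if c.size == 0:
        return None                    # empty contour: nothing to hull
    if c.dtype not in (np.int32, np.float32):
        c = c.astype(np.int32)         # CV_32S, accepted by cv2.convexHull
    return c.reshape(-1, 1, 2)         # OpenCV's (N, 1, 2) point layout

contour = prepare_contour([(0, 0), (4, 0), (4, 3)])
# then: hull = cv2.convexHull(contour, returnPoints=False)
```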