
priitj / whitedb


WhiteDB memory database

Home Page: http://whitedb.org/

License: GNU General Public License v3.0

CSS 0.41% JavaScript 6.63% Shell 0.41% C 84.08% Python 2.63% Makefile 0.51% Java 1.19% HTML 0.64% Lex 0.60% Yacc 0.44% Batchfile 0.27% M4 2.18%

whitedb's People

Contributors

bclothier, ckaran, fsgeek, h0x91b, jmrenouard, jubnzv, mailiish, patrick-yjchen, priitj, tammet, ulimit1


whitedb's Issues

Feature: WebSocket + CBOR/BSON + BEEP Interfaces

Hi,

For latency-sensitive, high-throughput applications plain TCP is not enough. Speed matters, and I think we would benefit from WebSocket, BSON and BEEP support.

I've researched for you:
WebSockets: http://www.aspl.es/nopoll/ or http://libwebsockets.org/trac/libwebsockets (persistent connections)
BEEP: http://www.aspl.es/vortex/ (custom protocol on top of WebSockets)
BSON: http://bsonspec.org/ (binary JSON)

If you add all this, it would also be valuable to let users plug in some kind of message-passing layer.
Solutions like ZeroMQ reduce the load massively by organizing incoming connections, reconnecting lost ones, and so on.

wg memory error: cannot attach to shared memory (No permission) with gtest

I am using WhiteDB with logging enabled. Under normal conditions everything works as expected. But when I try to test my code with gtest I get the error wg memory error: cannot attach to shared memory (No permission). when calling wg_attach_database(&DB_NAME[0], size) or wg_attach_logged_database(&DB_NAME[0], size).
The same test worked fine when I was using WhiteDB with logging disabled.
Are there any known issues with gtest? And what exactly does this error point to?

Best regards
Aleksandergabriel

Python Module won't load (Kubuntu 15.10/Python 3.4)

I tried compiling WhiteDB with the Python bindings, but the module won't load.

Here are the steps I took to compile and install:

wget http://whitedb.org/whitedb-0.7.3.tar.gz
tar -xvf whitedb-0.7.3.tar.gz
cd whitedb-0.7.3
./configure --with-python=/usr/bin/python3 --prefix=/usr/ # python3 is symlinked to python3.4
make
sudo make install

$PYTHONPATH:

$ python3 -c "import sys; print (sys.path)"
['', '/usr/lib/python3.4', '/usr/lib/python3.4/plat-x86_64-linux-gnu', '/usr/lib/python3.4/lib-dynload', '/usr/local/lib/python3.4/dist-packages', '/usr/lib/python3/dist-packages']

Proof the files are installed:

$ ls /usr/lib/python3.4/site-packages
__pycache__/  WGandalf.py  wgdb.a  wgdb.la*  wgdb.so*  whitedb.py

When I import whitedb, I get:

Traceback (most recent call last):
  File "/home/username/Code/infodump/db.py", line 3, in <module>
    import whitedb
ImportError: No module named 'whitedb'

I tried directly loading the module:

import importlib.machinery
db = importlib.machinery.SourceFileLoader('whitedb', '/usr/lib/python3.4/site-packages/whitedb.py').load_module()

Output:

Traceback (most recent call last):
  File "/home/username/Code/infodump/db.py", line 5, in <module>
    db = importlib.machinery.SourceFileLoader('whitedb', '/usr/lib/python3.4/site-packages/whitedb.py').load_module()
  File "<frozen importlib._bootstrap>", line 539, in _check_name_wrapper
  File "<frozen importlib._bootstrap>", line 1614, in load_module
  File "<frozen importlib._bootstrap>", line 596, in _load_module_shim
  File "<frozen importlib._bootstrap>", line 1220, in load
  File "<frozen importlib._bootstrap>", line 1200, in _load_unlocked
  File "<frozen importlib._bootstrap>", line 1129, in _exec
  File "<frozen importlib._bootstrap>", line 1471, in exec_module
  File "<frozen importlib._bootstrap>", line 321, in _call_with_frames_removed
  File "/usr/lib/python3.4/site-packages/whitedb.py", line 33, in <module>
    import wgdb
ImportError: No module named 'wgdb'

Any help would be appreciated.

Compiler flags for Python bindings

I've tried to build the Python bindings (using compile.sh) and, weeding through the compiler warnings and errors, I figured out that:

  • I had to specify -fPIC (otherwise wgdb.so won't link).
  • The -march option has an unorthodox value (it was rejected and I had to remove it). The man page for gcc says:
       -march=name
           This specifies the name of the target ARM architecture.  GCC uses this name to determine what kind of
           instructions it can emit when generating assembly code.  This option can be used in conjunction with or
           instead of the -mcpu= option.  Permissible names are: armv2, armv2a, armv3, armv3m, armv4, armv4t,
           armv5, armv5t, armv5e, armv5te, armv6, armv6j, armv6t2, armv6z, armv6zk, armv6-m, armv7, armv7-a,
           armv7-r, armv7-m, iwmmxt, iwmmxt2, ep9312.

           -march=native causes the compiler to auto-detect the architecture of the build computer.  At present,
           this feature is only supported on Linux, and not all architectures are recognized.  If the auto-detect
           is unsuccessful the option has no effect.

Its current value is pentium4, which seems to be a mistake.

After doing both, I could compile it, and the Python code using the wgdb library ran (although I did no extensive testing). I'm using gcc (GCC) 4.7.2 20121109 (Red Hat 4.7.2-8).

Error when trying to delete a record

I am unable to delete records from the database, regardless of the method used, in any database I create.
I am working on Ubuntu 16.04; WhiteDB was installed from the release at http://whitedb.org/download.html. So far every other functionality I was able to test works as expected.
I create a database using the C API and add some records. Using the wgdb command line tool I can confirm both that the records were created and that any changes I made to them actually took effect. The problem arises when I use:
wg_delete_record(DB, rec)
where DB is a global pointer to the database used successfully everywhere else and rec is a pointer to the record. The function returns -1, which refers to some error in the database, but I am not able to understand what that error is or how to fix it.
The -1 is returned from the function gint wg_delete_record(void* db, void *rec) in dbdata.c, where line 217:

    #ifdef USE_BACKLINKING
      if(*((gint *) rec + RECORD_BACKLINKS_POS))
        return -1;
    #endif

returns -1.
I also tried to use the wgdb tool to remove the record, but while I get confirmation that rows were deleted, a query shows the deleted rows still in the database.
Could you please explain why I get -1 back and how to fix it?
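For context on that guard: with USE_BACKLINKING enabled, a record that other records still point to (for example, a child inserted as part of a parent document) carries backlinks, and deletion is refused until it is detached. A toy model of the behaviour (illustrative only, not WhiteDB internals):

```c
/* Toy model of the USE_BACKLINKING guard in wg_delete_record. */
typedef struct Rec { int backlinks; } Rec;

static int toy_delete_record(Rec *rec) {
    if (rec->backlinks)     /* still referenced from another record */
        return -1;          /* the same -1 the reporter sees */
    /* ...freeing would happen here... */
    return 0;
}

static int demo(void) {
    Rec linked = {1}, standalone = {0};
    return toy_delete_record(&linked) == -1
        && toy_delete_record(&standalone) == 0;
}
```

In other words, the -1 here means "still referenced", not a corruption error: deleting the referencing records (or the parent structure) first should make the delete succeed.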

BUILD FAIL: configure cannot guess build type

The config.guess file contained in the source tarball is too old and fails to detect recent build architectures such as RISC-V. This can be fixed by generating a new tarball with the latest autotools.

Compilation Warnings on Windows (x64)

Hi, I'm getting the following warnings when I try to build a static library in Visual Studio 2017 for the x64 target; x86 doesn't produce warnings.

  1. Is it safe to ignore or suppress them?
  2. If not, what changes, if any, should I be making to the source files?

1>------ Build started: Project: WhiteDB, Configuration: Debug x64 ------
1>yajl_all.c
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\yajl_all.c(1816): warning C4267: 'function': conversion from 'size_t' to 'unsigned int', possible loss of data
1>dbutil.c
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbutil.c(202): warning C4267: '=': conversion from 'size_t' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbutil.c(244): warning C4267: '=': conversion from 'size_t' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbutil.c(249): warning C4244: '=': conversion from 'wg_int' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbutil.c(293): warning C4244: '=': conversion from 'wg_int' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbutil.c(353): warning C4244: '=': conversion from '__int64' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbutil.c(357): warning C4244: '=': conversion from 'wg_int' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbutil.c(378): warning C4244: '=': conversion from 'wg_int' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbutil.c(379): warning C4244: '+=': conversion from 'wg_int' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbutil.c(488): warning C4244: 'function': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbutil.c(491): warning C4244: 'function': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbutil.c(526): warning C4244: 'function': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbutil.c(529): warning C4244: 'function': conversion from 'gint' to 'int', possible loss of data
1>dbschema.c
1>dbquery.c
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbquery.c(254): warning C4244: '=': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbquery.c(290): warning C4244: '=': conversion from 'wg_int' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbquery.c(420): warning C4244: '=': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbquery.c(1185): warning C4267: 'function': conversion from 'size_t' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbquery.c(1195): warning C4267: 'function': conversion from 'size_t' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbquery.c(1205): warning C4267: 'function': conversion from 'size_t' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbquery.c(1251): warning C4267: '=': conversion from 'size_t' to 'int', possible loss of data
1>dbmpool.c
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbmpool.c(346): warning C4267: '=': conversion from 'size_t' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbmpool.c(348): warning C4267: '=': conversion from 'size_t' to 'int', possible loss of data
1>dbmem.c
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbmem.c(312): warning C4244: '=': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbmem.c(676): warning C4133: 'function': incompatible types - from 'char [100]' to 'LPCWSTR'
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbmem.c(741): warning C4244: 'function': conversion from 'gint' to 'DWORD', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbmem.c(742): warning C4133: 'function': incompatible types - from 'char [100]' to 'LPCWSTR'
1>dblog.c
1>dblock.c
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dblock.c(282): warning C4133: 'function': incompatible types - from 'volatile gint *' to 'volatile LONG *'
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dblock.c(282): warning C4244: 'function': conversion from 'gint' to 'LONG', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dblock.c(362): warning C4133: 'function': incompatible types - from 'volatile gint *' to 'volatile LONG *'
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dblock.c(362): warning C4244: 'function': conversion from 'gint' to 'LONG', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dblock.c(453): warning C4133: 'function': incompatible types - from 'volatile gint *' to 'volatile LONG *'
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dblock.c(453): warning C4244: 'function': conversion from 'gint' to 'LONG', possible loss of data
1>dbjson.c
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbjson.c(262): warning C4244: '=': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbjson.c(218): warning C4267: 'initializing': conversion from 'size_t' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbjson.c(292): warning C4267: '=': conversion from 'size_t' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbjson.c(405): warning C4267: '=': conversion from 'size_t' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbjson.c(762): warning C4244: 'return': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbjson.c(770): warning C4244: 'return': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbjson.c(778): warning C4244: 'return': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbjson.c(785): warning C4244: 'return': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbjson.c(798): warning C4244: 'return': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbjson.c(808): warning C4244: 'return': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbjson.c(815): warning C4244: 'return': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbjson.c(810): warning C4244: 'initializing': conversion from 'wg_int' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbjson.c(843): warning C4244: 'return': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbjson.c(855): warning C4244: 'return': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbjson.c(860): warning C4244: 'return': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbjson.c(838): warning C4244: 'initializing': conversion from 'wg_int' to 'int', possible loss of data
1>dbindex.c
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbindex.c(232): warning C4244: '=': conversion from 'gint' to 'unsigned char', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbindex.c(257): warning C4244: '=': conversion from 'gint' to 'unsigned char', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbindex.c(2300): warning C4244: 'initializing': conversion from 'gint' to 'int', possible loss of data
1>dbhash.c
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbhash.c(195): warning C4244: '=': conversion from 'wg_int' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbhash.c(315): warning C4244: '=': conversion from 'wg_int' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbhash.c(361): warning C4244: '=': conversion from 'wg_int' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbhash.c(601): warning C4334: '<<': result of 32-bit shift implicitly converted to 64 bits (was 64-bit shift intended?)
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbhash.c(637): warning C4334: '<<': result of 32-bit shift implicitly converted to 64 bits (was 64-bit shift intended?)
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbhash.c(692): warning C4334: '<<': result of 32-bit shift implicitly converted to 64 bits (was 64-bit shift intended?)
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbhash.c(697): warning C4334: '<<': result of 32-bit shift implicitly converted to 64 bits (was 64-bit shift intended?)
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbhash.c(750): warning C4334: '<<': result of 32-bit shift implicitly converted to 64 bits (was 64-bit shift intended?)
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbhash.c(762): warning C4244: '=': conversion from 'gint' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbhash.c(788): warning C4334: '<<': result of 32-bit shift implicitly converted to 64 bits (was 64-bit shift intended?)
1>dbdump.c
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbdump.c(142): warning C4244: 'function': conversion from '__int64' to 'long', possible loss of data
1>dbdata.c
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbdata.c(1119): warning C4244: '=': conversion from 'wg_int' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbdata.c(1617): warning C4013: 'floor' undefined; assuming extern returning int
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbdata.c(1665): warning C4244: 'return': conversion from 'wg_int' to 'int', possible loss of data
1>dbcompare.c
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbcompare.c(123): warning C4244: 'initializing': conversion from 'wg_int' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbcompare.c(124): warning C4244: 'initializing': conversion from 'wg_int' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbcompare.c(256): warning C4244: 'initializing': conversion from 'wg_int' to 'int', possible loss of data
1>d:\ipc\whitedblib\whitedb\whitedb\src\source\dbcompare.c(257): warning C4244: 'initializing': conversion from 'wg_int' to 'int', possible loss of data
1>dballoc.c
1>Generating Code...
1>WhiteDB.vcxproj -> D:\IPC\WhiteDBLib\WhiteDB\x64\Debug\WhiteDB.lib
1>Done building project "WhiteDB.vcxproj".
========== Build: 1 succeeded, 0 failed, 0 up-to-date, 0 skipped ==========
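For what it's worth: C4267/C4244 flag implicit 64-bit-to-32-bit narrowing, and C4334 flags a 32-bit shift whose result is widened to 64 bits. Such warnings are usually benign when the values fit, but can be silenced explicitly. A generic sketch of the typical fixes (plain C, not WhiteDB code):

```c
#include <limits.h>
#include <stddef.h>

/* Explicit, range-checked narrowing: the kind of change that silences
 * C4267/C4244 while documenting intent. Returns -1 if the value will
 * not fit into an int. */
static int narrow_to_int(size_t n) {
    if (n > (size_t)INT_MAX)
        return -1;
    return (int)n;
}

/* C4334: '1 << pos' is a 32-bit shift even when assigned to a 64-bit
 * variable. Promote the left operand before shifting. */
static long long bit64(int pos) {
    return 1LL << pos;          /* not 1 << pos */
}
```

The C4133 warnings about `char [100]` vs `LPCWSTR` are a different matter: they suggest the project is being built with the Unicode character set while the source expects ANSI Win32 APIs, and are worth fixing rather than suppressing.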

Suggest a locking mechanism

Hi,

I have read about WhiteDB and yes, it is a great database for graph applications.
Currently WhiteDB uses a read-write lock, so:

a. a reader can be blocked by another reader
b. a reader can be blocked by a writer
c. a writer can be blocked by another writer

We can eliminate "a" and "b" with a technique called "Append-Only/Copy-On-Write", used by LMDB. This is not like the complex MVCC implemented in relational databases such as MySQL, but it allows readers to proceed without being blocked by writers.

You can read the details in the paper here: http://symas.com/mdb/

What about this idea?
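To make the suggestion concrete, here is a minimal single-writer copy-on-write sketch (illustrative only, not WhiteDB or LMDB code): readers follow an atomically published root pointer to an immutable snapshot, while the writer prepares the next snapshot off to the side and publishes it with one atomic store.

```c
#include <stdatomic.h>
#include <stdlib.h>

/* Illustrative copy-on-write publication. */
typedef struct Snapshot { int version; int data; } Snapshot;

static _Atomic(Snapshot *) root;

/* Reader: never blocks, always sees a complete snapshot. */
static Snapshot *reader_view(void) {
    return atomic_load_explicit(&root, memory_order_acquire);
}

/* Single writer: build the new version aside, then publish atomically. */
static void writer_update(int data) {
    Snapshot *old = atomic_load(&root);
    Snapshot *next = malloc(sizeof *next);
    next->version = old ? old->version + 1 : 1;
    next->data = data;
    atomic_store_explicit(&root, next, memory_order_release);
    /* 'old' must be reclaimed only after readers drain (epoch/RCU);
     * omitted here for brevity. */
}

static int demo(void) {
    writer_update(10);
    writer_update(20);            /* readers were never blocked */
    return reader_view()->data;   /* latest published snapshot */
}
```

The hard part, as LMDB shows, is safe reclamation of old snapshots and keeping the write amplification acceptable, not the pointer swap itself.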

commercial licence

Hello

I know this is not the right channel for this kind of question, but unfortunately I was unable to get in touch with any of the contacts through the channels described on the website.
So the question is: is it still possible to get a commercial licence, and whom can I contact for more information?

Best regards
Aleš

Help! About Utilities tool

How do these work?

  1. wgdb query col "cond" value .. - basic query.
  2. wgdb del col "cond" value .. - like query. Matching rows are deleted from database.

I don't understand what col and "cond" are, or how to delete a row after I have added a value to the shared memory database.

Show me the example please. Thank you.

wg_database_freesize result never decreases

Hello
First of all, thanks for this project.

I'm trying to implement a cache on top of WhiteDB, but found that even after deleting all records, the wg_database_freesize result never decreases.
Is there a way to estimate the real free size?
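A possible explanation (an assumption about the internals, not confirmed by the WhiteDB source): wg_database_freesize may report only the never-allocated tail of the database area, while freed objects go onto internal free lists for reuse, so frees never grow the reported value. A toy model of that behaviour:

```c
/* Toy allocator model: a bump region plus a free list. "freesize" is
 * the untouched tail, so freeing grows the free list but never the
 * reported freesize. */
typedef struct {
    int total;      /* size of the whole area */
    int bumped;     /* high-water mark of allocations */
    int freelist;   /* bytes available for reuse after frees */
} Area;

static int freesize(const Area *a) { return a->total - a->bumped; }

static void toy_alloc(Area *a, int n) {
    if (a->freelist >= n) a->freelist -= n;   /* reuse freed space first */
    else a->bumped += n;                      /* otherwise grow watermark */
}

static void toy_free(Area *a, int n) { a->freelist += n; }

static int demo(void) {
    Area a = {100, 0, 0};
    toy_alloc(&a, 30);
    toy_free(&a, 30);        /* all records deleted... */
    return freesize(&a);     /* ...but freesize stays at 70 */
}
```

If this model holds, the usable space is freesize plus whatever sits on the free lists, and repeated insert/delete cycles of similar-sized records should not exhaust the database even though freesize never recovers.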

dump files not working after reboot

I am having a problem with importing dump files and accessing records after I reboot the computer. The dump files are created with the wg_dump function and imported with wg_import_dump(). When I was testing creating and importing dump files within a single run of the computer, everything worked as it should. But after a restart, the import function returns 0 as if everything were OK, yet afterwards I get:
wg memory error: memory_stats(): failed to get shmid.
wg memory error: Failed to get the access mode of the segment.
Should I store the dump file somehow differently?

Python installation on windows

Can we please get pip/easy_install support on windows, preferably with pre-compiled libraries so that the setup does not need Visual C++ compiler setup.

Also:

  1. After compiling on Windows with Python 2.7.3, I get this when I run tests.py:

................wg data handling error: wrong field number given to wg_set_field

.

Ran 17 tests in 1.137s

OK

  2. Is there a Python example somewhere that uses WhiteDB as a key-value store like Redis (preferably using whitedb.py)? I currently use Redis on a single node, and am trying to see if I can use WhiteDB.

Undefined symbols: _wg_compare_and_swap

Hi,
I tried compiling WhiteDB on Mac OS X, but the build kept failing at the linking phase of make.

Here are the full outputs from both ./configure and make: https://gist.github.com/joshleaves/4a7baa64372273b704eb

The issue seems to come from declaring wg_compare_and_swap as inline. When it is removed from the prototype and the definition, compilation works fine.

My knowledge of GCC options and C standards is a bit rusty, but apparently, depending on the standard (I tried compiling with "-std=c99" too), declaring a function inline can prevent it from being accessed from another file (in this case, dbdata.c).

My suggestion would be to remove the inline keyword, but this could affect performance. I've tried multiple combinations of keywords ("extern", as some articles recommended) but nothing made compilation work. I'm all ears and happy to run tests if you have a better idea that keeps the "inline" and makes it work.

Thanks for your help,
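For background, under C99 inline semantics a function defined plainly as inline need not emit an external definition, so calls from another translation unit (here dbdata.c) can fail at link time. Common fixes are static inline in the header, or exactly one extern inline instantiation in a .c file. A single-file sketch of the static inline variant (a generic illustration; this toy is not atomic like the real wg_compare_and_swap):

```c
/* static inline gives every translation unit its own copy, so the
 * linker never needs an external symbol -- avoiding the
 * undefined-symbol error at link time. NOTE: this toy version is not
 * atomic; the real function would use a compiler CAS builtin. */
static inline int toy_compare_and_swap(int *ptr, int oldv, int newv) {
    if (*ptr == oldv) { *ptr = newv; return 1; }
    return 0;
}

static int demo(void) {
    int x = 1;
    int ok1 = toy_compare_and_swap(&x, 1, 2);   /* succeeds, x becomes 2 */
    int ok2 = toy_compare_and_swap(&x, 1, 3);   /* fails, x unchanged */
    return ok1 == 1 && ok2 == 0 && x == 2;
}
```

The trade-off: static inline duplicates the code per translation unit, while the extern inline route keeps one out-of-line copy for callers that don't inline it.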

Compile JNI error on Mac OS X

First, I had to change compile.sh and compile_bridge.sh, replacing gcc -O2 -march=pentium4 with gcc -O2 -march=x86-64.

➜ jni git:(master) ✗ java -version
java version "1.7.0_45"
Java(TM) SE Runtime Environment (build 1.7.0_45-b18)
Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)

➜ jni git:(master) ✗ ./compile_java.sh
➜ jni git:(master) ✗ ./compile_bridge.sh
ld: can't map file, errno=22 file '/Library/Java/JavaVirtualMachines/jdk1.7.0_45.jdk/Contents/Home/include/darwin' for architecture x86_64
collect2: error: ld returned 1 exit status

Passing list to arglist returns 0 results in Python 3.4

Here's the function:

class DB():
...
    def search_notes(self, query):
        query = query.split(' ')
        print(query) # debug
        args = [(0, wgdb.COND_EQUAL, 'tag')]

        for i in query:
            args.append((1, wgdb.COND_EQUAL, i))

        self.cursor.execute(arglist=[(0, wgdb.COND_EQUAL, 'tag'), (1, wgdb.COND_EQUAL, 'note')]) # debug
        print((tuple(self.cursor.fetchone()))) # debug
        print(args) # debug
        self.cursor.execute(arglist=args)
        self.cursor.fetchall()
        print((tuple(self.cursor.fetchall()))) # debug

Here's my test:

import db


mainDB = db.DB()
mainDB.create_note('This is a note', ['note', 'programming', 'python', 'whitedb'])
notes = mainDB.search_notes('note')

This is the output:

['note'] # Value of 'query'.
('tag', 'note', <whitedb.Record object at 0x7f36e12b9208>) # Manual search result
[(0, 1, 'tag'), (1, 1, 'note')] # Value of 'args'
() # 'args' search result

32 and 64 bit process error

For example, I have two processes: wgdb86.exe (built with the 32-bit DLL) and wgdb64.exe (built with the 64-bit DLL). I cannot create a shared DB from the 32-bit app and add rows to it from the 64-bit app.

Incorrect consistency check

There is a consistency check in wg_free_object that verifies the previous entry is free and has the expected size. This check has two bugs: (1) it checks the size of prevobject when it should check prevobjecthead, so the consistency check isn't doing what it should; (2) that bug is masked by the fact that the ! on the getfreeobjectsize call doesn't bind to the entire conditional expression. PR forthcoming.
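The precedence point is easy to demonstrate: ! binds tighter than ==, so !f(x) == y negates the call result first and then compares, instead of negating the comparison. A minimal illustration with stand-in names (hypothetical, not the actual dballoc code):

```c
/* Stand-in for getfreeobjectsize: just returns its argument. */
static int size_of(int head) { return head; }

/* As written: '!' applies to the call only, then the result (0 or 1)
 * is compared with 'expected'. */
static int check_as_written(int head, int expected) {
    return !size_of(head) == expected;
}

/* As intended: negate the whole comparison. */
static int check_as_intended(int head, int expected) {
    return !(size_of(head) == expected);
}

static int demo(void) {
    /* head 0, expected 8: the written form yields (1 == 8) -> 0,
     * the intended form yields !(0 == 8) -> 1. They disagree. */
    return check_as_written(0, 8) != check_as_intended(0, 8);
}
```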

Segmentation fault when making query and null query question

I am having problems with generating queries containing strings.
I always use a query_arg array, so my first question is whether wg_query *query = wg_make_query(DB, NULL, 0, query_arg, 1); is the correct form, or should I provide something other than NULL for the void *matchrec parameter? I can't quite tell from the documentation what it is for.
Secondly, I am experiencing a segmentation fault in file dbquery.c at line 1229 (function: static gint encode_query_param_unistr(), line: return encode_shortstr_offset(ptrtooffset(db, dptr));).
I am adding a string to the query_arg array like this:
query_arg[0].value = wg_encode_query_param_str(DB, &record_type[0], NULL);
Is this OK, or should I pass something other than NULL for char *lang?

Ordering of result of wg_find_record_int function

The documentation says that this function returns a pointer to the first matching record.

  1. Does it mean that the first matching record is the one that was created first?
  2. Do subsequent calls to the function to get all matching records return records in the order in which they were created?
  3. If not, are there ways to sort the results of a query according to the order in which they were created?

Add setup.py to the python bindings

In time you'll want to release the Python bindings on PyPI so they can be installed using pip.
For now, a setup.py file should be added so that when the first version of the bindings is ready, you'll be able to release it.

DB names

I'm using logged WhiteDB, creating several databases with names like 111, 112, 211, 212, etc.
I found a strange thing: if the database name starts with any digit other than 1, no journal files are created. I also have the feeling that these databases intersect with the ones whose names start with 1.

Issue deleting a record that has the same value in a template index column as another record

I have encountered an issue where I am unable to delete a record if it has the same value as another record in a column that has a template index on it. For example, let's assume I have 3 entries in my table:
Column #: [0 , 1 , 2]
Record 1: [10, 0, 1.5]
Record 2: [10, 1, 2.5]
Record 3: [10, 1, 3.5]
I can delete record 1, but not 2 or 3. Calling wg_delete_record on either of those returns -3. This only occurs if I have created a template index on the table for column 1, restricted to entries whose column 0 == some value (e.g. 10).

I have attached zip file containing the source code for a program that reproduces this issue fairly easily. I have also attached my config.h that I built whitedb with.

The test does the following:

  1. Create the database
  2. Add a simple index on the 0th column. No issue
  3. Run a test with a single iteration. No issue
  4. Run a test with two iterations. No issue
  5. Add a template index on column 1 that only applies to values equal to the table id
    • This is the root cause of the issue
  6. Run a test with a single iteration. No issue
  7. Run a test with two iterations.
    • The delete command will fail to delete one of the records that has the same value in column 1 as another record. We will receive a -3.

Thank you for any help that you can provide.

reproducer.zip

I can't install WhiteDB Please Help

I can't install WhiteDB on my Linux machine. When I use

./configure; make; make install

I get:

-bash: ./configure: No such file or directory
make: *** No targets specified and no makefile found. Stop.
make: *** No rule to make target `install'. Stop.

Sorry for my bad English.
Please help me; this is my term project. Thank you.

A list of bugs found by static analyzer?

Hi all,
This is the Qihoo360 CodeSafe Team; we found some bugs in whitedb. Since I'm unfamiliar with whitedb, can you help me confirm them?

1. Null pointer dereference

The call wg_decode_record(db, data) at line 257 may return a null pointer, which is assigned (with an offset) to next_offset at line 258 and then dereferenced in while(*next_offset) {.

2. Unused values
There are some unused values in whitedb.

The value assigned to tmp is unused, see https://github.com/priitj/whitedb/blob/master/Db/dballoc.c#L245.
The value assigned to dvsize is unused, see https://github.com/priitj/whitedb/blob/master/Db/dballoc.c#L1085.
The value assigned to dataptr is unused, see https://github.com/priitj/whitedb/blob/master/Db/dbdata.c#L2759.
The value assigned to strdata is unused, see https://github.com/priitj/whitedb/blob/master/Test/dbtest.c#L5190.
The value assigned to tok is unused, see https://github.com/priitj/whitedb/blob/master/json/yajl_all.c#L1432.

Thanks in advance!

Packaging issue

Small issue,

The release file downloaded from http://whitedb.org/whitedb-0.7.3.tar.gz is not actually gzipped. Not an issue at all, but the .gz suffix makes you think it is.

BTW, this project seems to be dormant for the last year or so. Please don't give up on it!! It's such a great piece of software.

Add higher concurrency to the benchmarks

Redis and MongoDB are extensively used in web applications, where the concurrency requirements are much higher than 8 threads due to the number of parallel requests for data. Therefore, higher concurrency values should be added to the benchmark.

You can also leave instructions on how exactly you ran the benchmark, and I'll do it and share the results.

Image processing

Hey
We are looking for the right way to share a buffer of images between different processes in our system, to make real-time decisions. I'm wondering if WhiteDB is a good choice for sharing an image matrix. What do you think? Can you give me a direction for how to use it in this particular case? If not, do you by chance have any other suggestions?

Edit: we mainly look for some possibility to do circular buffering, and maybe a pub/sub design like Redis has, but we are open-minded about other options as well.
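On the circular-buffering point, a fixed-size ring of frame slots is simple to keep in any shared memory region, whichever store is chosen. A minimal sketch in plain C (generic, not tied to WhiteDB; the int "frames" stand in for image handles or offsets):

```c
/* Minimal fixed-size ring buffer: the newest SLOTS frames are kept,
 * older frames are overwritten. */
#define SLOTS 4

typedef struct {
    int head;            /* index of the oldest frame */
    int count;           /* number of frames currently stored */
    int frames[SLOTS];   /* frame handles/offsets, not pixel data */
} Ring;

static void ring_push(Ring *r, int frame) {
    r->frames[(r->head + r->count) % SLOTS] = frame;
    if (r->count < SLOTS)
        r->count++;
    else
        r->head = (r->head + 1) % SLOTS;   /* overwrite the oldest */
}

static int demo(void) {
    Ring r = {0, 0, {0}};
    for (int i = 1; i <= 6; i++)           /* push frames 1..6 into 4 slots */
        ring_push(&r, i);
    return r.frames[r.head];               /* oldest surviving frame */
}
```

For large images it is usually better to keep the pixel buffers in a plain shared memory segment and store only small descriptors (offset, size, timestamp) in the ring, so readers never copy frames they skip.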

confusing error message [wg data handling error: wrong field number given to wg_get_field #]

I have started to see some error output from the wg_get_field function. The error message reads like: wg data handling error: wrong field number given to wg_get_field 5, where the number varies between 4, 5 and 6 as far as I can tell.
I assumed I was passing a wrong field position integer to wg_get_field, but after some line-by-line searching with a debugger I realized that the parameters given to wg_get_field are correct and the field number is not connected to the number in the error message. Also, after decoding the field I get the correct result.
Is the number connected to some error code, or is there something else wrong that I am missing?

http://whitedb.org/ is down

Forbidden
You don't have permission to access / on this server.
Apache/2.4.39 Server at whitedb.org Port 80

Something's wrong on the server. I know the documentation is also in the repo, but since the READMEs here point to that URL, this should probably be fixed.

You could switch the domain to GitHub Pages so you don't have to worry about hosting; that seems to be the likely problem. I know I've forgotten to pay the bill on sites before ...

How to update a query

I created JSON data from a file and inserted it into the database as structured records. How can I update those structured records? I have not yet found any API functions for that.

This is my JSON file

        {
           "userDetails":
             {
                      "username":"Jobs",
                      "password":"abc123",
                      "cookie":"fdsghfdshgfjghfkjghjkjhl"
              },
         "applicationsInfo":[
          {
              "application":"app1",
              "tier":"gold",
              "consumerKey":"werewr4sdrgfdsy5tgfhgfhjtg7gjh"
          }
         ]
       }

I want to push application details into applicationsInfo and update the cookie. Can I do those operations with the current API?

undefined reference to wg_create_index

Hi, I have downloaded the latest version, configured and compiled it.
Everything worked perfectly: creating records and reading.
But when I try to add indexing I get the linking error:
undefined reference to `wg_create_index(void*, long, long, long*, long)'
I have included indexapi.h, so the compilation worked, but somehow the linker
doesn't see this API, nor does it see wg_int wg_create_multi_index.
Please help.
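The mangled-looking signature in that error (types spelled out after the function name) suggests the declaration was compiled with C++ linkage against a library built as C. A likely fix, assuming that is the cause, is to wrap the include so the declarations get C linkage:

```cpp
extern "C" {
#include "indexapi.h"
}
```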

License Question

Hello, my team and I were wondering if there would be any possibility of providing WhiteDB with an MIT (or MIT-equivalent) license option. We really like WhiteDB, its performance, and its API. However, GPLv3 is too restrictive for our use, and unfortunately, due to factors outside of our control, pursuing a commercial license is also very difficult.

I look forward to hearing from you. Thank you,
Eric Mammoser

Error when restoring the database (with indexing) from the log file.

I have a problem when writing to the database after restoring it from the log file. The database is used with indexing (without indexing it works). I suspect that the problem lies in the very large value (1660303687824186088) that I use (wg_int = -9223372036854775808 ... 9223372036854775807).
After the database is restored from the log, I write a new element and get a crash (Segmentation fault (core dumped)).
Woops! Crashed! signal 11
0# 0x00007F24336E8896 in /usr/local/lib/swarm/libkitten_logger.so
1# 0x00007F24332EF0C0 in /lib/x86_64-linux-gnu/libc.so.6
2# wg_decode_int in /usr/local/lib/libwgdb.so.0
3# wg_compare in /usr/local/lib/libwgdb.so.0
4# wg_search_ttree_rightmost in /usr/local/lib/libwgdb.so.0
5# 0x00007F2432C1E271 in /usr/local/lib/libwgdb.so.0
6# wg_index_add_field in /usr/local/lib/libwgdb.so.0
7# wg_set_field in /usr/local/lib/libwgdb.so.0

I think there is an integer overflow somewhere when restoring from the log.
Attached is the test file.
test_wdb.zip

Also, there is an integer overflow in the output of "wgdb select". I am attaching the patch.
patch.zip


indexapi.h missing C extern for C++

When including the index API in my C++ code, I was not able to find definitions for any of the index API functions. I solved this by adding an extern "C" { } block around my include of indexapi.h. I think this extern guard should be included in the file itself so that users of the API don't need to add it themselves.

__asm is not allowed in 64-bit msvc builds

This patch should fix the issue:

diff --git a/Db/dblock.c b/Db/dblock.c
index bb7d4d4..21d148b 100644
--- a/Db/dblock.c
+++ b/Db/dblock.c
@@ -92,9 +92,8 @@ extern "C" {
 #define MM_PAUSE
 #endif
 #elif defined(_WIN32)
-#define MM_PAUSE {\
-  __asm {_emit 0xf3}; __asm{_emit 0x90};\
-}
+#include <emmintrin.h>
+#define MM_PAUSE { _mm_pause(); }
 #endif

 /* Helper function for implementing atomic operations

Clarification

"Records are encoded as an offset from the start of the shared memory segment to the start of the record."
It is not checked whether the record passed to wg_set_field(,, wg_int data) exists.
So is it safe if I have two databases, A and B,
to encode a record from A (wg_encode_record) and store it in a field of B as a record link into A?
I did some tests and everything seemed fine. Are the encoded IDs fixed, or can they change if any kind of reordering happens?

Question: find first missing entry

I've read the docs and the sources, but couldn't quite figure out how to do it.

For example, suppose the database has 6 one-field records containing the numbers {10, 0, 1, 2, 27, 5}.

What can I use to find the smallest number that is not in the database? E.g. in the above that would be 3.

I imagine some kind of iteration is needed, and I found that wg_query even has *_offset fields. But I didn't find any functionality for that, and from inspecting in gdb I couldn't make sense of the offsets either (I've seen end_offset being much smaller than curr_offset).

Support for extending database size?

As I understand it, WhiteDB requires that a size parameter be provided when creating/loading the database. This doesn't work well in scenarios where the size isn't known in advance and could vary considerably.

If it's not practical to implement extending an open database's memory space, what strategies are available for allocating more memory on demand in a performant manner?
