
bopc's People

Contributors

gannimo, ispoleet, lightninghkm, sei-eschwartz

bopc's Issues

TypeError: unhashable type: 'dict'

My setup is the same configuration I reported in issue #1.
I've also tried to run things without the cache:

    $ BINARY=~/src/3rd/BOPC/evaluation/nginx1
    $ PAYLOAD=~/src/3rd/BOPC/payloads/execve.spl
    $ ENTRY=0x41cd03
    $ ./source/BOPC.py -dd --binary $BINARY --source $PAYLOAD --entry $ENTRY --format gdb

I get a different kind of error:

    WARNING | 2019-01-23 23:34:00,256 | angr.analyses.disassembly_utils | Your version of capstone does not support MIPS instruction groups.

    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
    %                                                                    %
    %                :::::::::   ::::::::  :::::::::   ::::::::          %
    %               :+:    :+: :+:    :+: :+:    :+: :+:    :+:          %
    %              +:+    +:+ +:+    +:+ +:+    +:+ +:+                  %
    %             +#++:++#+  +#+    +:+ +#++:++#+  +#+                   %
    %            +#+    +#+ +#+    +#+ +#+        +#+                    %
    %           #+#    #+# #+#    #+# #+#        #+#    #+#              %
    %          #########   ########  ###         ########                %
    %                                                                    %
    %                Block Oriented Programming Compiler                 %
    %                                                                    %
    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%


    [*] Starting BOPC v2.1 at 23/01/2019 23:34
    [23:34:00,281] [+] Compiling '/home/tullsen/src/3rd/BOPC/payloads/execve.spl'...
    [23:34:00,281] [+] Parsing started.
    [23:34:00,284] [+] Parsing complete.
    [23:34:00,284] [+] Fixing jump/goto targets...
    [23:34:00,284] [+] Done.
    [23:34:00,284] [+] Semantic analysis started.
    [23:34:00,284] [+] Semantic analysis completed.
    [23:34:00,284] [+] Compilation completed.
    [23:34:00,284] [+] Optimizer started. Mode: 'none'
    [23:34:00,284] [+] Removing labels...
    [23:34:00,285] [+] Done.
    [23:34:00,285] [+] Optimization completed.
    [23:34:00,285] [+] Optimized IR:
    [23:34:00,285] [+] -------------------------------- @__0 --------------------------------
    [23:34:00,285] [+]  {'type': 'entry', 'uid': 0}
    [23:34:00,285] [+] -------------------------------- @__2 --------------------------------
    [23:34:00,285] [+]  {'type': 'varset', 'uid': 2, 'val': ['/bin/sh\x00'], 'name': 'prog'}
    [23:34:00,285] [+] -------------------------------- @__4 --------------------------------
    [23:34:00,285] [+]  {'type': 'varset', 'uid': 4, 'val': [('prog',), '\x00\x00\x00\x00\x00\x00\x00\x00'], 'name': 'argv'}
    [23:34:00,285] [+] -------------------------------- @__6 --------------------------------
    [23:34:00,285] [+]  {'reg': 0, 'type': 'regset', 'valty': 'var', 'val': ('prog',), 'uid': 6}
    [23:34:00,285] [+] -------------------------------- @__8 --------------------------------
    [23:34:00,286] [+]  {'reg': 1, 'type': 'regset', 'valty': 'var', 'val': ('argv',), 'uid': 8}
    [23:34:00,286] [+] -------------------------------- @__10 --------------------------------
    [23:34:00,286] [+]  {'reg': 2, 'type': 'regset', 'valty': 'num', 'val': 0, 'uid': 10}
    [23:34:00,286] [+] -------------------------------- @__12 --------------------------------
    [23:34:00,286] [+]  {'uid': 12, 'args': [0, 1, 2], 'dirty': ['rax', 'rcx', 'rdx', 'r10', 'r11'], 'alt': [], 'type': 'call', 'name': 'execve'}
    [23:34:00,513] [+] Generating CFG. It might take a while...
    WARNING | 2019-01-23 23:34:10,886 | angr.engines.successors | Exit state has over 256 possible solutions. Likely unconstrained; skipping. <BV64 global_c000014_17_64{UNINITIALIZED}>
    WARNING | 2019-01-23 23:34:11,001 | angr.engines.successors | Exit state has over 256 possible solutions. Likely unconstrained; skipping. <BV64 global_c00001d_35_64{UNINITIALIZED}>
    ...
    [23:34:18,090] [+] CFG generated.
    [23:34:18,090] [+] Normalizing CFG...
    [23:34:18,722] [+] Done.
    [23:34:18,731] [*] CFG has 24169 nodes and 44565 edges
    [23:34:18,776] [+] Basic block abstraction process started.
    defaultdict(<type 'list'>, {'pruned': [], 'deadended': [], 'active': [], 'unconstrained': [], 'errored': [], 'unsat': [], 'stashed': []})
    [23:34:18,840] [WARNING] Symbolic Execution at block 0x40218d failed: 'There are no usable stashes!' Much sad :( Skipping current block....
    defaultdict(<type 'list'>, {'pruned': [], 'deadended': [], 'active': [], 'unconstrained': [], 'errored': [], 'unsat': [], 'stashed': []})
    [23:34:20,692] [WARNING] Symbolic Execution at block 0x4029a5 failed: 'There are no usable stashes!' Much sad :( Skipping current block....
    defaultdict(<type 'list'>, {'pruned': [], 'deadended': [], 'active': [], 'unconstrained': [], 'errored': [], 'unsat': [], 'stashed': []})
    ...
    [23:55:30,960] [WARNING] Symbolic Execution at block 0x45d996 failed: 'There are no usable stashes!' Much sad :( Skipping current block....
    defaultdict(<type 'list'>, {'pruned': [], 'deadended': [], 'active': [], 'unconstrained': [], 'errored': [], 'unsat': [], 'stashed': []})
    [23:55:30,972] [WARNING] Symbolic Execution at block 0x45d9b0 failed: 'There are no usable stashes!' Much sad :( Skipping current block....
    [23:55:30,972] [+] 100% completed
    defaultdict(<type 'list'>, {'pruned': [], 'deadended': [], 'active': [], 'unconstrained': [], 'errored': [], 'unsat': [], 'stashed': []})
    [23:55:30,986] [WARNING] Symbolic Execution at block 0x45d9b4 failed: 'There are no usable stashes!' Much sad :( Skipping current block....
    [23:55:30,986] [+] Done.
    [23:55:30,986] [+] Searching CFG for candidate basic blocks...
    [23:55:30,986] [+] Creating vartab...
    [23:55:30,986] [+] Done.
    Traceback (most recent call last):
      File "./source/BOPC.py", line 447, in <module>
        X = mark.mark_candidate(sorted(map(lambda s : tuple(s.split('=')), args.mapping)))
      File "/home/tullsen/src/3rd/BOPC/source/mark.py", line 940, in mark_candidate
        nx.set_node_attributes(self.__rg, 'immutable', {'__r%d' % vr:1})
      File "/home/tullsen/.virtualenvs/bopc1/local/lib/python2.7/site-packages/networkx/classes/function.py", line 654, in set_node_attributes
        G.nodes[n][name] = values
    TypeError: unhashable type: 'dict'
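
This looks like a networkx 1.x vs 2.x API mismatch: set_node_attributes() swapped its argument order in networkx 2.0, so the 1.x-style call in mark.py:940 ends up using the attribute dict as the attribute name, which is unhashable. A minimal sketch of the difference (illustrative node name borrowed from the '__r%d' pattern in the traceback, not BOPC's actual data; assumes networkx 2.x is installed):

    import networkx as nx

    G = nx.DiGraph()
    G.add_node('__r0')

    # networkx 1.x call style, as in mark.py:940 -- under 2.x the dict becomes
    # the attribute *name*, and indexing with it raises the TypeError above:
    #     nx.set_node_attributes(G, 'immutable', {'__r0': 1})

    # networkx 2.x argument order is (G, values, name):
    nx.set_node_attributes(G, {'__r0': 1}, 'immutable')
    print(G.nodes['__r0'])   # {'immutable': 1}

If that reading is right, this is the same class of problem as the nodes_iter error reported in the "setup.sh incorrect" issue below: the code targets networkx 1.11 while a networkx 2.x release is installed.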

what is entry?

I have read your paper, but I don't understand what it is. I am reproducing the nginx attack; could you send your experiment data to me? Please!

Can Data Oriented Programming be used to open a shell?

Hi

The DOP paper in S&P 2016 can only leak some information.

I found that in your BOP paper there is an execve payload for nginx.

Does that mean that just using a buffer overflow to modify some local variable, rather than the return address, can open a shell?

It's amazing!!

Mapping ID changes every time in the Proftpd example

Hi, I tried to run the proftpd example but no solution was found. I did find the required mapping, but its ID is different from the one in the README.md file. In fact, the mapping IDs change every time BOPC runs, so the mapping ID I got from an earlier execution with --enum-mappings is different from the one in a later search execution with --mapping-id. Any suggestion is welcome. Thanks!

Could you give the command to get execve on nginx?

Hi, I'm trying to reproduce the experiment of getting execve on nginx, but BOPC always returns 0 solutions. Could you provide specific instructions for this program?

The following is the command I tried:

./source/BOPC.py -dd --binary evaluation/nginx1 --source payloads/execve.spl --abstraction load --entry -1 --format gdb

It would be nice if you could point out the mistake above.

error networkx.exception.NetworkXPointlessConcept: ('Connectivity is undefined ', 'for the null graph.')

Thank you for answering my last question, gannimo!
Now I am trying to use the beginning of the ngx_signal_handler function as the entry.
But when I use "../source/BOPC.py -dd --binary nginxl --source infloop.spl --abstractions save --entry 0x000000000041c750 --format gdb", the error occurs.
The error output is:

[21:04:27,701] [+] Trace searching algorithm started.
[21:04:27,701] [+] Enumerating all mappings between virtual and hardware registers
[21:04:27,701] [+]      and all mappings between variables and addresses...
G is null
Traceback (most recent call last):
  File "../source/BOPC.py", line 482, in <module>
    tsearch.trace_searching(mark)
  File "/home/dyc_lab/BOPC/source/search.py", line 916, in trace_searching
    rval = mapping.enum_mappings( self.__mapping_callback )
  File "/home/dyc_lab/BOPC/source/map.py", line 468, in enum_mappings
    ret = match.enum_max_matchings(self.__intrl_callback_reg, self.__nregs)
  File "/home/dyc_lab/BOPC/source/map.py", line 342, in enum_max_matchings
    if self.__callback( M ) < 0:
  File "/home/dyc_lab/BOPC/source/map.py", line 424, in __intrl_callback_reg
    if match.enum_max_matchings(self.__intrl_callback_var, self.__nvars) < 0:
  File "/home/dyc_lab/BOPC/source/map.py", line 299, in enum_max_matchings
    M = nx.bipartite.maximum_matching(self.__G)
  File "/home/dyc_lab/.virtualenvs/angr/local/lib/python2.7/site-packages/networkx/algorithms/bipartite/matching.py", line 146, in hopcroft_karp_matching
    left, right = bipartite_sets(G, top_nodes)
  File "/home/dyc_lab/.virtualenvs/angr/local/lib/python2.7/site-packages/networkx/algorithms/bipartite/basic.py", line 204, in sets
    if not is_connected(G):
  File "<decorator-gen-108>", line 2, in is_connected
  File "/home/dyc_lab/.virtualenvs/angr/local/lib/python2.7/site-packages/networkx/utils/decorators.py", line 73, in _not_implemented_for
    return not_implement_for_func(*args, **kwargs)
  File "/home/dyc_lab/.virtualenvs/angr/local/lib/python2.7/site-packages/networkx/algorithms/components/connected.py", line 162, in is_connected
    'for the null graph.')
networkx.exception.NetworkXPointlessConcept: ('Connectivity is undefined ', 'for the null graph.')

At first I thought I might have set a wrong entry, but that is not the case. Then I thought maybe the ELF (nginxl in evaluation, which otherwise works well) was wrong, so I compiled nginx-1.4.0 myself and found that it still reports this error.
Finally I wondered whether something is wrong in the CFG, because during analysis it reports hundreds of times:

[WARNING] Symbolic Execution at block 0x471ded failed: 'There are no usable stashes!' Much sad :( Skipping current block....

I know I may need to provide more information, but I don't even know what is relevant.
Please help me!
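
For context, the exception itself is easy to reproduce outside BOPC: nx.bipartite.maximum_matching() has to split the graph into its two node sets, and without an explicit top_nodes argument it falls back to a connectivity check, which is undefined for an empty graph. A minimal sketch (assuming networkx 2.x, as in the traceback above):

    import networkx as nx

    # "G is null": the register/variable matching graph ends up with no nodes.
    G = nx.Graph()

    # With no top_nodes hint, maximum_matching() calls is_connected(G)
    # internally, which raises NetworkXPointlessConcept for the empty graph.
    nx.bipartite.maximum_matching(G)

So the networkx error looks like a symptom rather than the cause: the matching graph built in map.py is empty at this point, which may well be related to the many failed block abstractions ('There are no usable stashes!') reported earlier.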

Can you give the corresponding entry points for the test samples you give?

When I run the test samples you give, I can't find a solution using various entry points. Take "sudo" as an example. According to your article, the entry point is the instruction address that can be reached after triggering the AWP. I set it to the address of the next "free" instruction after "vsprintf", corresponding to the format string vulnerability, but I can't find a solution. Can you give more of the entry points used for testing in your article, such as the "proftpd" entry point, the "sudo" entry point, and the "nginx" entry point, or elaborate on the rules for discovering entry points, or explain how you determined them?

Clarification on License

Hi, thanks for the great work and the excellent artefact! It really was very easy to set up and run locally!

I'm interested in building on top of this work --- could you clarify the license under which this has been released?

Sorry to be a bother; I realise that this project hasn't been updated in a while.

"unwarranted" exceptions occurring

setup.sh did not work out of the box; my install was as follows:

pipenv install angr==7.8.9.26
pipenv install claripy==7.8.9.26
pipenv install networkx==1.11
   # got an error here, so I did the following instead:
pipenv install networkx==2.1
pipenv install matplotlib
pipenv install simuvex
pipenv install graphviz==0.8.1
pipenv install pygraphviz==1.3.1

Now, when I do

__binary__=~/src/3rd/BOPC/evaluation/nginx1
./source/BOPC.py -dd -b $__binary__ -a saveonly

I get this

[*] Starting BOPC v2.1 at 10/11/2018 23:05
[23:05:24,881] [+] Generating CFG. It might take a while...
WARNING | 2018-11-10 23:05:36,854 | angr.engines.successors | Exit state has over 256 possible solutions. Likely unconstrained; skipping. <BV64 global_c000000_13_64{UNINITIALIZED}>
WARNING | 2018-11-10 23:05:36,927 | claripy.vsa.strided_interval | Reversing a real strided-interval <64>0x1[0x500000000000000, 0x5ffffffffffffff]R(uninit) is bad
WARNING | 2018-11-10 23:05:36,970 | angr.engines.successors | Exit state has over 256 possible solutions. Likely unconstrained; skipping. <BV64 Reverse(unconstrained_read_25_64)>
...
WARNING | 2018-11-10 23:05:39,814 | angr.engines.successors | Exit state has over 256 possible solutions. Likely unconstrained; skipping. <BV64 global_8003f6780_312_64{UNINITIALIZED}>
WARNING | 2018-11-10 23:05:39,854 | angr.engines.successors | Exit state has over 256 possible solutions. Likely unconstrained; skipping. <BV64 global_c0001b4_317_64{UNINITIALIZED}>
[23:05:45,133] [+] CFG generated.
[23:05:45,134] [+] Normalizing CFG...
[23:05:45,885] [+] Done.
[23:05:45,897] [*] CFG has 24169 nodes and 44565 edges
[23:05:45,951] [+] Basic block abstraction process started.
defaultdict(<type 'list'>, {'pruned': [], 'deadended': [], 'active': [], 'unconstrained': [], 'errored': [], 'unsat': [], 'stashed': []})
...
[23:23:02,662] [WARNING] Symbolic Execution at block 0x45d996 failed: 'There are no usable stashes!' Much sad :( Skipping current block....
defaultdict(<type 'list'>, {'pruned': [], 'deadended': [], 'active': [], 'unconstrained': [], 'errored': [], 'unsat': [], 'stashed': []})
[23:23:02,674] [WARNING] Symbolic Execution at block 0x45d9b0 failed: 'There are no usable stashes!' Much sad :( Skipping current block....
[23:23:02,674] [+] 100% completed
defaultdict(<type 'list'>, {'pruned': [], 'deadended': [], 'active': [], 'unconstrained': [], 'errored': [], 'unsat': [], 'stashed': []})
[23:23:02,688] [WARNING] Symbolic Execution at block 0x45d9b4 failed: 'There are no usable stashes!' Much sad :( Skipping current block....
[23:23:02,689] [+] Done.
[23:23:02,689] [+] Saving basic block abstractions to a file...
Traceback (most recent call last):
  File "./source/BOPC.py", line 501, in <module>
    abstract(mark, args.abstractions, args.binary)
  File "./source/BOPC.py", line 341, in abstract
    mark.save_abstractions(filename)
  File "/home/tullsen/src/3rd/BOPC/source/mark.py", line 423, in save_abstractions
    pickle.dump(abstr, output, 0)           # pickle dictionary using protocol 0.
  File "/usr/lib/python2.7/pickle.py", line 1376, in dump
    Pickler(file, protocol).dump(obj)
  File "/usr/lib/python2.7/pickle.py", line 224, in dump
    self.save(obj)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 655, in save_dict
    self._batch_setitems(obj.iteritems())
  File "/usr/lib/python2.7/pickle.py", line 669, in _batch_setitems
    save(v)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 655, in save_dict
    self._batch_setitems(obj.iteritems())
  File "/usr/lib/python2.7/pickle.py", line 669, in _batch_setitems
    save(v)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 655, in save_dict
    self._batch_setitems(obj.iteritems())
  File "/usr/lib/python2.7/pickle.py", line 668, in _batch_setitems
    save(k)
  File "/usr/lib/python2.7/pickle.py", line 331, in save
    self.save_reduce(obj=obj, *rv)
  File "/usr/lib/python2.7/pickle.py", line 401, in save_reduce
    save(args)
  File "/usr/lib/python2.7/pickle.py", line 271, in save
    pid = self.persistent_id(obj)
  File "/usr/lib/python2.7/pickle.py", line 333, in persistent_id
    def persistent_id(self, obj):
  File "/home/tullsen/src/3rd/BOPC/source/absblk.py", line 851, in __sig_handler
    raise Exception("Alarm triggered after %d seconds" % ABSBLK_TIMEOUT)
Exception: Alarm triggered after 5 seconds

From a quick look at the code, it seems like the exception defined on these lines:

source/absblk.py:851:
source/absblk.py:852:
source/absblk.py:853:
source/absblk.py:854:

cannot be raised once we reach source/mark.py:421. But I must be mistaken, because the following 'hack' does stop it from occurring:

diff --git a/source/mark.py b/source/mark.py
index 950150f..52f77f1 100755
--- a/source/mark.py
+++ b/source/mark.py
@@ -58,6 +58,7 @@ import pprint
 import math
 import re

+import signal


 # -------------------------------------------------------------------------------------------------
@@ -418,6 +419,8 @@ class mark( object ):
         for node, _ in nx.get_node_attributes(self.__cfg.graph,'fail').iteritems():
             fail.add(node.addr)

+        signal.alarm(0)
         try:
             output = open(filename + '.abs', 'wb')  # create the file
             pickle.dump(abstr, output, 0)           # pickle dictionary using protocol 0.
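
One plausible reading of why the hack works: absblk.py appears to arm a SIGALRM-based timeout around each block's symbolic execution (the handler name and the 5-second value come from the traceback), and if a timer is left armed when the per-block work finishes, the alarm can fire much later inside unrelated code such as pickle.dump() in save_abstractions(). A minimal sketch of that failure mode (illustrative only; the sleep stands in for the slow pickling step):

    import signal
    import time

    ABSBLK_TIMEOUT = 5  # seconds, as in the exception message

    def __sig_handler(signum, frame):   # handler name as in the traceback
        raise Exception("Alarm triggered after %d seconds" % ABSBLK_TIMEOUT)

    signal.signal(signal.SIGALRM, __sig_handler)

    # Arm the per-block timeout; nothing disarms it before moving on...
    signal.alarm(ABSBLK_TIMEOUT)

    # ...so it fires later, in the middle of whatever happens to be running
    # (here a sleep; in BOPC apparently pickle.dump in mark.save_abstractions).
    time.sleep(ABSBLK_TIMEOUT + 1)

Under that reading the exception is not really raised "at" mark.py, and explicitly disarming the timer with signal.alarm(0) before saving, as in the diff above, is a reasonable fix.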

setup.sh incorrect

In setup.sh, these packages are incompatible:
pip install angr==7.8.9.26
pip install networkx==1.11

When I allow pip to install networkx==2.1, I get an error as follows:

WARNING | 2018-11-11 02:34:23,538 | angr.engines.successors | Exit state has over 256 possible solutions. Likely unconstrained; skipping. <BV64 global_c0001fb_492_64{UNINITIALIZED}>
WARNING | 2018-11-11 02:34:24,897 | angr.engines.successors | Exit state has over 256 possible solutions. Likely unconstrained; skipping. <BV64 global_c000205_499_64{UNINITIALIZED}>
[02:34:37,328] [+] CFG generated.
[02:34:37,328] [+] Normalizing CFG...
[02:34:37,970] [+] Done.
[02:34:37,986] [*] CFG has 27087 nodes and 49902 edges
[02:34:38,046] [+] Loading basic block abstractions from file...
Traceback (most recent call last):
File "./source/BOPC.py", line 512, in <module>
  abstract(mark, args.abstractions, args.binary)
File "./source/BOPC.py", line 333, in abstract
  mark.load_abstractions(filename)            # simply load the abstractions
File "/home/tullsen/src/3rd/BOPC/source/mark.py", line 464, in load_abstractions
  for node, attr in self.__cfg.graph.nodes_iter(data=True):
AttributeError: 'DiGraph' object has no attribute 'nodes_iter'

This is surprising to me, because 'nodes_iter' appears to have been dropped in the move from NetworkX 1.x to NetworkX 2.0.
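
For reference, nodes_iter() is the networkx 1.x iterator API; in networkx 2.x the nodes view itself is iterable. A minimal compatibility sketch (assuming self.__cfg.graph is a plain networkx DiGraph, as the traceback suggests; the node and attribute are illustrative):

    import networkx as nx

    G = nx.DiGraph()
    G.add_node(0x400000, fail=False)

    # networkx 1.x style, as in mark.py:464 -- removed in 2.0:
    #     for node, attr in G.nodes_iter(data=True): ...

    # networkx 2.x equivalent; also works on 1.x, where it returns a list:
    nodes_with_attrs = list(G.nodes(data=True))   # [(4194304, {'fail': False})]

That said, patching individual call sites one by one may be a losing battle; pinning networkx to the 1.11 version that setup.sh intends is probably the safer route.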

Eager for the VM of BOPC

This is great work, and my research is relevant to BOP. I have tried to reproduce BOPC, but I couldn't run it because of an angr error, so in the end I failed. Could you tell me how to get the BOPC virtual machine mentioned in your slides? @balbassam
