bslatkin / dpxdt

Make continuous deployment safe by comparing before and after webpage screenshots for each release. Depicted shows when any visual, perceptual differences are found. This is the ultimate, automated end-to-end test.

Home Page: https://dpxdt-test.appspot.com

License: Apache License 2.0

Python 73.20% Mako 0.10% HTML 17.16% JavaScript 7.65% Makefile 0.04% Shell 0.43% CSS 1.29% Batchfile 0.13%

dpxdt's People

Contributors

aschran, beausmith, bslatkin, cericoda, danvk, elsigh, feigner, g4b1nagy, gbritz, gentoo90, hloedo000, jbarratt, jeremyolliver, koddsson, loedolff, mhemesath, nicohaase, ryanch, shijir, siderakis, ygravrand


dpxdt's Issues

Issue with running the run_shell on ubuntu

Not sure if this is on you, or even something you can help with, but when I try to run the tool, I am getting the following error.

./run_shell.sh
Traceback (most recent call last):
File "", line 1, in
File "dpxdt/server/init.py", line 45, in
import api
File "dpxdt/server/api.py", line 86, in
from flask.exceptions import HTTPException
ImportError: No module named exceptions

quit()

A pip freeze shows...

Brlapi==0.5.6
Flask==0.10.1
Flask-Login==0.2.4
Flask-SQLAlchemy==0.16
Flask-WTF==0.8.3
GnuPGInterface==0.3.2
Jinja==1.2
Jinja2==2.7
Landscape-Client==12.05
Mako==0.5.0
MarkupSafe==0.18
PAM==0.4.2
PIL==1.1.7
SQLAlchemy==0.7.4
Twisted-Core==11.1.0
Twisted-Names==11.1.0
Twisted-Web==11.1.0
WTForms==1.0.4
Werkzeug==0.9.1
adium-theme-ubuntu==0.3.2
apt-xapian-index==0.44
apturl==0.5.1ubuntu3
argparse==1.2.1
chardet==2.0.1
command-not-found==0.2.44
configglue==1.0
debtagshw==0.1
defer==1.0.2
dirspec==3.0.0

FIXME: could not find svn URL in dependency_links for this package:

distribute==0.6.24dev-r0
duplicity==0.6.18
httplib2==0.7.2
itsdangerous==0.22
jockey==0.9.7
keyring==0.9.2
language-selector==0.1
launchpadlib==1.9.12
lazr.restfulclient==0.12.0
lazr.uri==1.0.3
louis==2.3.0
nvidia-common==0.0.0
oauth==1.0.1
onboard==0.97.1
oneconf==0.2.8.1
pexpect==2.3
piston-mini-client==0.7.2
protobuf==2.4.1
pyOpenSSL==0.12
pycrypto==2.4.1
pycups==1.9.61
pycurl==7.19.0
pyinotify==0.9.2
pyserial==2.5
pysmbc==1.0.13
python-apt==0.8.3ubuntu7.1
python-dateutil==1.5
python-debian==0.1.21ubuntu1
python-virtkey==0.60.0
pyxdg==0.19
reportlab==2.5
rhythmbox-ubuntuone==3.0.0
screen-resolution-extra==0.0.0
sessioninstaller==0.0.0
simplejson==2.3.2
software-center-aptd-plugins==0.0.0
system-service==0.1.6
ubuntuone-couch==0.3.0
ubuntuone-installer==3.0.2
ubuntuone-storage-protocol==3.0.2
ufw==0.31.1-1
unattended-upgrades==0.1
unity-lens-video==0.3.5
unity-scope-video-remote==0.3.5
usb-creator==0.2.23
wadllib==1.3.0
wsgiref==0.1.2
xdiagnose==2.5.3
xkit==0.0.0
zope.interface==3.6.1
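
For reference, the freeze above shows Flask 0.10.1, where the flask.exceptions module that dpxdt/server/api.py imports no longer resolves, which matches the ImportError. A minimal compatibility shim, assuming HTTPException is the only name needed from that import, might look like:

    # Hedged sketch of a compatibility import for dpxdt/server/api.py.
    # Assumes HTTPException is the only name needed from flask.exceptions;
    # werkzeug.exceptions (Werkzeug 0.9.1 is installed above) provides it directly.
    try:
        from flask.exceptions import HTTPException  # older Flask releases
    except ImportError:
        from werkzeug.exceptions import HTTPException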

When pdiff returns a non-zero return code, give up on the task

For errors like this:

/tmp/tmpqf9pPp/ref PNG 1047x2994 1047x2994+0+0 8-bit DirectClass 1.739MB 0.390u 0:00.450
/tmp/tmpqf9pPp/run PNG 1047x2933 1047x2933+0+0 8-bit DirectClass 1.771MB 0.290u 0:00.280
compare.im6: image widths or heights differ `/tmp/tmpqf9pPp/ref' @ error/compare.c/CompareImageCommand/962.
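
A minimal sketch of the requested behavior, assuming the worker can see the subprocess return code; the report_permanent_failure callback below is hypothetical, not an existing dpxdt hook:

    # Hedged sketch: stop retrying a pdiff task when `compare` exits non-zero
    # for a deterministic reason such as mismatched image dimensions.
    # `report_permanent_failure` is a hypothetical callback, not dpxdt API.
    def handle_pdiff_result(return_code, log_path, report_permanent_failure):
        if return_code != 0:
            with open(log_path) as handle:
                # Retrying would only reproduce the same failure, so give up now.
                report_permanent_failure(
                    'compare exited with %d:\n%s' % (return_code, handle.read()))
            return False
        return True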

Add button to make build public

Make it easy for open source projects to let users see their builds. Make sure that public builds are read-only for non-owners.

Sometimes get deadlock when requesting a run

I tried to fix this with nested transactions (on the Runs and WorkQueue tables) but that made SQLite unhappy. Calling db.session.flush() doesn't work either. I think the cause is the lockmode='update' when the run is selected.

2013-07-26 14:51:48.722 /api/request_run 500 2228ms 0kb Python-urllib/2.7
ELIDED - - [26/Jul/2013:14:51:48 -0700] "POST /api/request_run HTTP/1.1" 500 337 - "Python-urllib/2.7" "dpxdt-test.appspot.com" ms=2228 cpu_ms=0 cpm_usd=0.000038 pending_ms=439 app_engine_release=1.8.2 instance=00c61b117cbd5b476db0c7e6e5ed26da0cfd7e14
D 2013-07-26 14:51:46.956
Authenticated as API key=u'ELIDED'
I 2013-07-26 14:51:47.310
Created run: build_id=ELIDED, release_name=u'ELIDED', release_number=14, run_name='creator_step_2_of_3_websat_image_choice'
D 2013-07-26 14:51:47.448
Upload already exists: artifact_id='1fa68fa780103bb717005ebb58da61491ffd1d43'
I 2013-07-26 14:51:47.548
Enqueueing capture task='14316:ddacaeb20863e337ca4b7b7a60fd998d3b80ba62', baseline=False
D 2013-07-26 14:51:47.819
Upload already exists: artifact_id='f409f62c47f836df867e4ab32eaf2bbf31da75da'
I 2013-07-26 14:51:48.217
Enqueueing capture task='14316:2f75fb9ae9b71f328108309cd51ebf0ab94742b8:baseline', baseline=True
E 2013-07-26 14:51:48.491
Exception on /api/request_run [POST]
Traceback (most recent call last):
  File "./lib/flask/app.py", line 1809, in wsgi_app
    response = self.full_dispatch_request()
  File "./lib/flask/app.py", line 1482, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "./lib/flask/app.py", line 1480, in full_dispatch_request
    rv = self.dispatch_request()
  File "./lib/flask/app.py", line 1466, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/base/data/home/apps/s~dpxdt-test/07-21-r01.368952019609286338/dpxdt/server/auth.py", line 331, in wrapped
    return f(*args, **kwargs)
  File "/base/data/home/apps/s~dpxdt-test/07-21-r01.368952019609286338/dpxdt/server/api.py", line 342, in request_run
    db.session.commit()
  File "./lib/sqlalchemy/orm/scoping.py", line 149, in do
    return getattr(self.registry(), name)(*args, **kwargs)
  File "./lib/sqlalchemy/orm/session.py", line 721, in commit
    self.transaction.commit()
  File "./lib/sqlalchemy/orm/session.py", line 354, in commit
    self._prepare_impl()
  File "./lib/sqlalchemy/orm/session.py", line 334, in _prepare_impl
    self.session.flush()
  File "./lib/sqlalchemy/orm/session.py", line 1818, in flush
    self._flush(objects)
  File "./lib/sqlalchemy/orm/session.py", line 1936, in _flush
    transaction.rollback(_capture_exception=True)
  File "./lib/sqlalchemy/util/langhelpers.py", line 56, in __exit__
    compat.reraise(exc_type, exc_value, exc_tb)
  File "./lib/sqlalchemy/orm/session.py", line 1900, in _flush
    flush_context.execute()
  File "./lib/sqlalchemy/orm/unitofwork.py", line 372, in execute
    rec.execute(self)
  File "./lib/sqlalchemy/orm/unitofwork.py", line 525, in execute
    uow
  File "./lib/sqlalchemy/orm/persistence.py", line 64, in save_obj
    table, insert)
  File "./lib/sqlalchemy/orm/persistence.py", line 538, in _emit_insert_statements
    execute(statement, multiparams)
  File "./lib/sqlalchemy/engine/base.py", line 662, in execute
    params)
  File "./lib/sqlalchemy/engine/base.py", line 763, in _execute_clauseelement
    compiled_sql, distilled_params
  File "./lib/sqlalchemy/engine/base.py", line 876, in _execute_context
    context)
  File "./lib/sqlalchemy/engine/base.py", line 1020, in _handle_dbapi_exception
    exc_info
  File "./lib/sqlalchemy/util/compat.py", line 199, in raise_from_cause
    reraise(type(exception), exception, tb=exc_tb)
  File "./lib/sqlalchemy/engine/base.py", line 869, in _execute_context
    context)
  File "./lib/sqlalchemy/engine/default.py", line 326, in do_execute
    cursor.execute(statement, parameters)
  File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/storage/speckle/python/api/rdbms.py", line 566, in execute
    self._DoExec(request)
  File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/storage/speckle/python/api/rdbms.py", line 446, in _DoExec
    response = self._conn.MakeRequest('Exec', request)
  File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/storage/speckle/python/api/rdbms.py", line 940, in MakeRequest
    response = self._MakeRetriableRequest(stub_method, request)
  File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/storage/speckle/python/api/rdbms.py", line 969, in _MakeRetriableRequest
    raise _ToDbApiException(sql_exception)
OperationalError: (OperationalError) (1213L, u'Deadlock found when trying to get lock; try restarting transaction') 'INSERT INTO work_queue (task_id, queue_name, live, eta, source, created, finished, lease_attempts, last_lease, last_owner, heartbeat, heartbeat_number, payload, content_type) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)' ('14316:2f75fb9ae9b71f328108309cd51ebf0ab94742b8:baseline', 'capture', 1, datetime.datetime(2013, 7, 26, 21, 51, 48, 451540), 'request_run', datetime.datetime(2013, 7, 26, 21, 51, 48, 452970), None, 0, None, None, None, None, '{"build_id": ELIDED, "config_sha1sum": "f409f62c47f836df867e4ab32eaf2bbf31da75da", "baseline": true, "run_name": "ELIDED", "url": "ELIDED", "release_name": "ELIDED", "release_number": 14}', 'application/json')
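
To make the suspected cause concrete, here is a hedged illustration (not dpxdt's actual code) of the pattern described above: the run row is selected with lockmode='update', so the row lock is held while the work_queue INSERTs happen and is only released at commit. Two concurrent /api/request_run calls that acquire these locks in different orders can then hit MySQL error 1213. All names below are placeholders.

    def request_run_with_lock(session, Run, release_id, run_name, enqueue_capture):
        # Hedged illustration only; Run, the filter columns, and
        # enqueue_capture are placeholders standing in for dpxdt's models.
        run = (
            session.query(Run)
            .filter_by(release_id=release_id, name=run_name)
            .with_lockmode('update')   # SELECT ... FOR UPDATE; held until commit
            .first())
        enqueue_capture(run)           # INSERT INTO work_queue while the lock is held
        session.commit()               # the row lock is only released here
        return run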

Prevent double-reporting race conditions

Right now if a capture worker is reporting results, but then loses its lease because it's taking too long, it may overwrite other work from other capture workers. The capture worker should time itself out locally when it knows its time is up. Or the server should take a task ID as a parameter and use it to see if the caller still holds the lock for mutations.
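
A minimal sketch of the second suggestion (server-side lease verification), using the work_queue columns visible in the logs on this page (live, eta, last_owner); the function and its arguments are illustrative, not existing dpxdt API:

    def caller_still_holds_lease(task, owner, now):
        """Hedged sketch: should the server accept results from `owner`?"""
        if task is None or not task.live:
            return False   # the task already finished or was deleted
        if task.last_owner != owner:
            return False   # the task was re-leased to another worker
        if task.eta < now:
            return False   # the lease expired; these results may be stale
        return True

With a check like this on the report endpoint, a slow worker whose lease expired would have its results rejected instead of overwriting the newer worker's data.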

PdiffItem timeout too short

Running perceptualdiff on two screenshots of http://www.timeanddate.com/ takes 2 minutes on a VM (running on a Xeon CPU @ 2.00GHz). This causes the diff task to never complete since the timeout is set to 30 seconds.

It looks like the instance running on https://dpxdt-test.appspot.com has the same problem since the diff in my build there (see build id=6 if you have access) is also not completing.

Here's what I changed on my local instance to make the diffs not time out (not sure if both changes were needed):


--- a/dpxdt/client/workers.py
+++ b/dpxdt/client/workers.py
@@ -275,7 +275,7 @@ class FetchThread(WorkerThread):
 class ProcessItem(WorkItem):
     """Work item that is handled by running a subprocess."""
 
-    def __init__(self, log_path, timeout_seconds=30):
+    def __init__(self, log_path, timeout_seconds=300):
         """Initializer.

--- a/dpxdt/server/work_queue.py
+++ b/dpxdt/server/work_queue.py
@@ -320,7 +320,7 @@ def handle_lease(queue_name):
         task = lease(
             queue_name,
             owner,
-            request.form.get('timeout', 60, type=int))
+            request.form.get('timeout', 300, type=int))
     except Error, e:
         return utils.jsonify_error(e)
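
Note that the handle_lease() hunk above already reads a timeout form field, so a worker could also ask for a longer lease explicitly instead of patching the server default. A hedged sketch of such a request (host, queue name, and the omitted API-key authentication are illustrative):

    import urllib
    import urllib2

    # Hedged sketch: lease a run-pdiff task with a 300-second timeout. The
    # 'timeout' form field comes from handle_lease() above; host/port are
    # illustrative and API-key authentication is omitted for brevity.
    response = urllib2.urlopen(
        'http://localhost:5000/api/work_queue/run-pdiff/lease',
        urllib.urlencode({'timeout': 300}))
    print response.read()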

README!

... where do I start, how do I boot it, how do I integrate it with my app? :-)

Data pending perpetually

Hi there. I get the data pending status perpetually after running the local server and running the test script for google/yahoo. My local phantomjs path is correct and the logs (in the UI) are empty. Any help is much appreciated. Thanks!

Do tools exist to configure tests (including JS) for dpxdt?

The sitediff tool seems to create a bunch of new runs (TEST run, not release run?) on the fly by crawling URLs. Is there a tool that can read a directory of tests or a yaml / json config file, instead of crawling a site? Is there any tool that can associate a given JS file with a test?

If these tools don't exist, would you be interested in a pull request with such a tool?


That's my main set of questions. I've also taken some time to write up my current understanding of dpxdt and a confusion re. request_run. Normally I'd edit it out of the issue but it may be useful, if you have the time to read it, in understanding my state of mind and the confusions a developer might have coming to the project.

I've just started to explore dpxdt and it looks very useful. I got the sitediff tool up and running and... wow, what a great resource!

Unfortunately, a lot of our app is heavily dependent on JS interactions. I see APIs for injecting JS in a release, but I would need to inject different JS for each test run. It looks like client/capture.js just runs one test at a time, accepting a bit of injected JS at the per-test level, but the API only accepts injectJs for request_run, which runs the whole set of tests for the whole release candidate(?).

[10 minutes of code-reading pass]

Now I see that SiteDiff makes a new RequestRunWorkflow, which calls /request_run, so maybe request_run is really just running a single test and not a whole release? That is counter to my understanding from the guide:

Requests a new run for a release candidate. Causes the API system to take screenshots and do pdiffs. When ref_url and ref_config are supplied, the system will run two sets of captures (one for the baseline, one for the new release) and then compare them.
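
For what it's worth, a config-driven runner like the one asked about could be a thin loop that builds one capture config per test, each with its own JavaScript, and hands them to the existing request_run workflow. A hedged sketch, where the YAML schema and the injectJs key are assumptions (targetUrl matches the capture configs shown elsewhere on this page):

    import json
    import yaml  # assumes PyYAML is installed

    # Hedged sketch of a YAML-driven test list; wiring each config into
    # /api/request_run is left to the existing client workflows.
    def load_tests(path):
        with open(path) as handle:
            suite = yaml.safe_load(handle)
        for test in suite['tests']:
            config = {'targetUrl': test['url']}
            if test.get('inject_js'):
                with open(test['inject_js']) as js:
                    config['injectJs'] = js.read()
            yield test['name'], config

    if __name__ == '__main__':
        for name, config in load_tests('tests.yaml'):
            print name, json.dumps(config)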

Retry transactions for work_queue because of deadlock

Exceptions like this:

2013-08-05 10:44:54.045
Exception on /api/work_queue/run-pdiff/finish [POST]
Traceback (most recent call last):
File "./lib/flask/app.py", line 1809, in wsgi_app
response = self.full_dispatch_request()
File "./lib/flask/app.py", line 1482, in full_dispatch_request
rv = self.handle_user_exception(e)
File "./lib/flask/app.py", line 1480, in full_dispatch_request
rv = self.dispatch_request()
File "./lib/flask/app.py", line 1466, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/base/data/home/apps/s~dpxdt-test/08-04-r01.369284316178806127/dpxdt/server/auth.py", line 347, in wrapped
return f(*args, **kwargs)
File "/base/data/home/apps/s~dpxdt-test/08-04-r01.369284316178806127/dpxdt/server/work_queue.py", line 394, in handle_finish
db.session.commit()
File "./lib/sqlalchemy/orm/scoping.py", line 149, in do
return getattr(self.registry(), name)(*args, **kwargs)
File "./lib/sqlalchemy/orm/session.py", line 721, in commit
self.transaction.commit()
File "./lib/sqlalchemy/orm/session.py", line 354, in commit
self._prepare_impl()
File "./lib/sqlalchemy/orm/session.py", line 334, in _prepare_impl
self.session.flush()
File "./lib/sqlalchemy/orm/session.py", line 1818, in flush
self._flush(objects)
File "./lib/sqlalchemy/orm/session.py", line 1936, in _flush
transaction.rollback(_capture_exception=True)
File "./lib/sqlalchemy/util/langhelpers.py", line 56, in exit
compat.reraise(exc_type, exc_value, exc_tb)
File "./lib/sqlalchemy/orm/session.py", line 1900, in _flush
flush_context.execute()
File "./lib/sqlalchemy/orm/unitofwork.py", line 372, in execute
rec.execute(self)
File "./lib/sqlalchemy/orm/unitofwork.py", line 525, in execute
uow
File "./lib/sqlalchemy/orm/persistence.py", line 59, in save_obj
mapper, table, update)
File "./lib/sqlalchemy/orm/persistence.py", line 492, in _emit_update_statements
execute(statement, params)
File "./lib/sqlalchemy/engine/base.py", line 662, in execute
params)
File "./lib/sqlalchemy/engine/base.py", line 763, in _execute_clauseelement
compiled_sql, distilled_params
File "./lib/sqlalchemy/engine/base.py", line 876, in _execute_context
context)
File "./lib/sqlalchemy/engine/base.py", line 1020, in _handle_dbapi_exception
exc_info
File "./lib/sqlalchemy/util/compat.py", line 199, in raise_from_cause
reraise(type(exception), exception, tb=exc_tb)
File "./lib/sqlalchemy/engine/base.py", line 869, in _execute_context
context)
File "./lib/sqlalchemy/engine/default.py", line 326, in do_execute
cursor.execute(statement, parameters)
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/storage/speckle/python/api/rdbms.py", line 566, in execute
self._DoExec(request)
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/storage/speckle/python/api/rdbms.py", line 446, in _DoExec
response = self._conn.MakeRequest('Exec', request)
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/storage/speckle/python/api/rdbms.py", line 940, in MakeRequest
response = self._MakeRetriableRequest(stub_method, request)
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/storage/speckle/python/api/rdbms.py", line 969, in _MakeRetriableRequest
raise _ToDbApiException(sql_exception)
OperationalError: (OperationalError) (1213L, u'Deadlock found when trying to get lock; try restarting transaction') 'UPDATE work_queue SET live=%s, finished=%s WHERE work_queue.task_id = %s AND work_queue.queue_name = %s' (0, datetime.datetime(2013, 8, 5, 17, 44, 53, 933150), '14758:377aa165ec4ad58c417d2940c59f483419fa3134:377aa165ec4ad58c417d2940c59f483419fa3134', 'run-pdiff')
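
A minimal retry sketch for MySQL error 1213, assuming the whole handler (its queries plus db.session.commit()) is safe to re-run from scratch; the decorator name and the backoff policy are illustrative:

    import functools
    import time

    from sqlalchemy.exc import OperationalError

    # Hedged sketch: re-run a request handler when MySQL reports error 1213,
    # "Deadlock found when trying to get lock; try restarting transaction".
    # `session` stands in for dpxdt's db.session.
    def retry_on_deadlock(session, attempts=3, delay_seconds=0.1):
        def decorator(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                for attempt in range(attempts):
                    try:
                        return func(*args, **kwargs)
                    except OperationalError as e:
                        session.rollback()  # discard the failed transaction
                        if '1213' not in str(e.orig) or attempt == attempts - 1:
                            raise
                        time.sleep(delay_seconds * (attempt + 1))
            return wrapper
        return decorator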

Run-pdiff Failed "coercing to Unicode"

Hello,
I've set up an instance to try to run dpxdt locally; it succeeded in taking the screenshots, marked the release as a ref, etc.

Now it is trying to run the diff, but it's failing.
Here is the error I got from /api/work_queue/run-pdiff

Task failed. TypeError: coercing to Unicode: need string or buffer, NoneType found

And from the server

ERROR queue_workers.py:157] Exception while processing work from queue_url='http://localhost:5000/api/work_queue/run-pdiff', task={u'last_lease': 1372419385, u'task_id': u'b729f0fde8e54fa1937c0271734a40ea', u'created': 1372355914, u'queue_name': u'run-pdiff', u'lease_attempts': 66, u'source': None, u'eta': 1372419445, u'content_type': u'application/json', u'payload': {u'build_id': 1, u'reference_sha1sum': u'3bcab30d66ef30a2652935ddb1884be23cf43217', u'release_number': 1, u'run_sha1sum': u'3bcab30d66ef30a2652935ddb1884be23cf43217', u'run_name': u'run_blog2', 'heartbeat': <function heartbeat at 0x7feaf4040668>, u'release_name': u'release3'}}
Traceback (most recent call last):
  File "/opt/dpxdt/dpxdt/client/queue_workers.py", line 154, in run
    yield local_queue_workflow(**payload)
  File "/opt/dpxdt/dpxdt/client/workers.py", line 558, in handle_item
    next_item = generator.throw(*item.error)
  File "/opt/dpxdt/dpxdt/client/queue_workers.py", line 234, in run
    diff_path, log_path, diff_success)
  File "/opt/dpxdt/dpxdt/client/workers.py", line 562, in handle_item
    next_item = generator.send(item)
  File "/opt/dpxdt/dpxdt/client/release_worker.py", line 350, in run
    if os.path.isfile(diff_path) and os.path.isfile(log_path):
  File "/usr/lib/python2.7/genericpath.py", line 29, in isfile
    st = os.stat(path)
TypeError: coercing to Unicode: need string or buffer, NoneType found

Thanks for your help!

I'm really looking forward to making it work, and I'll be happy to write and add the quickstart guide as soon as I have it working.
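
The traceback points at release_worker.py calling os.path.isfile() with a None path (no diff.png was produced). A hedged guard for that call:

    import os

    # Hedged sketch: treat a missing/None path as "no file" instead of letting
    # os.stat raise "coercing to Unicode: need string or buffer, NoneType found".
    def isfile_if_set(path):
        return path is not None and os.path.isfile(path)

With a guard like that, a pdiff that produced no output would surface as a missing diff artifact rather than as this unrelated TypeError.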

Allow uploaded screenshot artifacts to be used by /api/request_run

/api/request_run does not describe a way to bypass screenshot generation and just do pdiffs.

This is desirable since phantomjs only does webkit-based screenshotting.

/api/upload allows uploading of artifacts and could therefore be used to upload externally generated screenshots. It returns a sha1sum identifying the uploaded artifact.

What's missing is a way to use that sha1sum in /api/request_run. This could be achieved via new parameters, screenshot and ref_screenshot, each accepting a sha1sum as its value.
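
A hedged sketch of the proposed flow from a client's point of view; the screenshot and ref_screenshot parameters are exactly the additions proposed above and do not exist in dpxdt today, and API-key authentication is omitted:

    import urllib
    import urllib2

    # Hedged sketch only. The two sha1sums are assumed to have come back from
    # /api/upload already; 'screenshot' and 'ref_screenshot' are *proposed*
    # parameters, not an existing API.
    params = urllib.urlencode({
        'build_id': 1,
        'release_name': 'externally-rendered',
        'run_name': 'homepage',
        'url': 'http://example.com/',
        'screenshot': 'sha1sum-of-new-capture',
        'ref_screenshot': 'sha1sum-of-baseline-capture',
    })
    print urllib2.urlopen('http://localhost:5000/api/request_run', params).read()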

MS Windows

Can Depicted be run on an MS Windows box? After following the steps, at the ./run_shell.sh step in a Git Bash terminal window, I get the following error:

$ ./run_shell.sh
Traceback (most recent call last):
File "", line 1, in
File "dpxdt__init__.py", line 19, in
import gflags
File "c:\dpxdt\dependencies\lib\gflags.py", line 1
../python-gflags/gflags.py

any help on this will be very appreciated.

Use hot-keys to quickly scroll through images

Once you're in the single run/artifact view, use j/k to go between runs, y to mark a run as good, f to flip between before/diff/after, n to mark it as bad, u to go up to the whole release, and ? to show the key commands.

Cancel previous release attempt's run when next attempt is added

When a new release attempt is cut for the same release name, we should go back and cancel all pending tasks for the previous attempt (if it's still in the "processing" state) and not even run them, because we know they're either not going to finish or are bad. This will keep old tasks from old releases from gumming up the system.

Never generated Diff image

I am able to set up Depicted on my local machine.

I executed the run_site_diff.sh utility, but it never generated a diff image. What I get is a 'Before' image and an 'After' image, with the diff image empty and always showing '? Diff pending'. Do you see anything missing in my setup?

The diff log contains: compare: unrecognized option `-highlight-color'.

[snapshot and diff_log screenshots were attached to the original issue]

Failed to run pdiff

Hello,
I've set up an instance to try to run dpxdt locally; it succeeded in taking the screenshots, marked the release as a ref, etc.

Now it is trying to run the diff, but it's failing.

In the log I can see that the PdiffThread failed.

DEBUG workers.py:134] WorkflowThread:140647352755968 processed item=client.queue_workers.HeartbeatWorkflow({error=None, args=('http://localhost:5000/api/work_queue/run-pdiff', u'b729f0fde8e54fa1937c0271734a40ea', 'Running per...), done=True, result=None, kwargs={}, root=False})
DEBUG workers.py:307] PdiffThread:140647361148672 item=client.pdiff_worker.PdiffItem({log_path='/tmp/tmpV8zBZX/log.txt', error=None, timeout_seconds=60, return_code=None, output_path='/tmp/tmpV8zBZX/diff.png', ref_path='/tmp/tmpV8zBZX/ref', run_path='/tmp/tmpV8zBZX/run'}) Running subprocess: ['compare', '-verbose', '-metric', 'RMSE', '-highlight-color', 'Red', '-compose', 'Src', '/tmp/tmpV8zBZX/ref', '/tmp/tmpV8zBZX/run', '/tmp/tmpV8zBZX/diff.png']
ERROR workers.py:316] PdiffThread:140647361148672 item=client.pdiff_worker.PdiffItem({log_path='/tmp/tmpV8zBZX/log.txt', error=None, timeout_seconds=60, return_code=None, output_path='/tmp/tmpV8zBZX/diff.png', ref_path='/tmp/tmpV8zBZX/ref', run_path='/tmp/tmpV8zBZX/run'}) Failed to run subprocess: ['compare', '-verbose', '-metric', 'RMSE', '-highlight-color', 'Red', '-compose', 'Src', '/tmp/tmpV8zBZX/ref', '/tmp/tmpV8zBZX/run', '/tmp/tmpV8zBZX/diff.png']
DEBUG workers.py:131] PdiffThread:140647361148672 error item=client.pdiff_worker.PdiffItem({log_path='/tmp/tmpV8zBZX/log.txt', error=(<type 'exceptions.OSError'>, OSError(2, 'No such file or directory'), <traceback object at 0x3844b4...), timeout_seconds=60, return_code=None, output_path='/tmp/tmpV8zBZX/diff.png', ref_path='/tmp/tmpV8zBZX/ref', run_path='/tmp/tmpV8zBZX/run'})
DEBUG workers.py:554] Transitioning workflow=client.queue_workers.DoPdiffQueueWorkflow({error=None, args=(), done=False, result=None, kwargs={build_id=1, reference_sha1sum=u'3bcab30d66ef30a2652935ddb1884be23cf43217', release_number=1, run_sha1sum=u'3bcab30d66ef30a2652935ddb1884be23cf43217', run_name=u'run_blog2', heartbeat=<function heartbeat at 0x383b848>, release_name=u'release3'}, root=False}), generator=<generator object run at 0x3834050>, item=client.pdiff_worker.PdiffItem({log_path='/tmp/tmpV8zBZX/log.txt', error=(<type 'exceptions.OSError'>, OSError(2, 'No such file or directory'), <traceback object at 0x3844b4...), timeout_seconds=60, return_code=None, output_path='/tmp/tmpV8zBZX/diff.png', ref_path='/tmp/tmpV8zBZX/ref', run_path='/tmp/tmpV8zBZX/run'})
DEBUG workers.py:134] WorkflowThread:140647352755968 processed item=client.pdiff_worker.PdiffItem({log_path='/tmp/tmpV8zBZX/log.txt', error=(<type 'exceptions.OSError'>, OSError(2, 'No such file or directory'), <traceback object at 0x3844b4...), timeout_seconds=60, return_code=None, output_path='/tmp/tmpV8zBZX/diff.png', ref_path='/tmp/tmpV8zBZX/ref', run_path='/tmp/tmpV8zBZX/run'})

I guess the "No such file or directory" error is there because it failed to generate the /tmp/tmpV8zBZX/diff.png. All the other files are there :

file /tmp/tmpV8zBZX/log.txt
/tmp/tmpV8zBZX/log.txt: empty
file /tmp/tmpV8zBZX/ref
/tmp/tmpV8zBZX/ref: PNG image data, 980 x 10406, 8-bit/color RGBA, non-interlaced
file /tmp/tmpV8zBZX/run
/tmp/tmpV8zBZX/run: PNG image data, 980 x 10406, 8-bit/color RGBA, non-interlaced
file /tmp/tmpV8zBZX/diff.png
/tmp/tmpV8zBZX/diff.png: ERROR: cannot open `/tmp/tmpV8zBZX/diff.png' (No such file or directory)

/tmp/tmpV8zBZX/log.txt is empty, so I don't have any more information.

I'm really stuck now, and I'm not good enough to understand how to run this pdiff manually => Failed to run subprocess: ['compare', '-verbose', '-metric', 'RMSE', '-highlight-color', 'Red', '-compose', 'Src', '/tmp/tmpV8zBZX/ref', '/tmp/tmpV8zBZX/run', '/tmp/tmpV8zBZX/diff.png']

Thanks for your help!

I'm really looking forward to making it work, and I'll be happy to write and add the quickstart guide as soon as I have it working.
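
For what it's worth, OSError(2, 'No such file or directory') raised while launching a subprocess usually means the executable itself (here ImageMagick's compare) cannot be found on PATH, rather than an input file being missing. A hedged way to reproduce the pdiff step by hand with the exact command from the log:

    import subprocess

    # Hedged sketch: rerun the failing pdiff command manually. The paths are
    # the ones from the log above and will only exist on that machine.
    cmd = ['compare', '-verbose', '-metric', 'RMSE',
           '-highlight-color', 'Red', '-compose', 'Src',
           '/tmp/tmpV8zBZX/ref', '/tmp/tmpV8zBZX/run', '/tmp/tmpV8zBZX/diff.png']
    try:
        subprocess.check_call(cmd)
    except OSError as error:
        print 'Could not launch compare at all:', error
    except subprocess.CalledProcessError as error:
        print 'compare ran but exited with status', error.returncode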

No module named dpxdt

Hi, when running ./site_diff.py (with the relevant flags) I get:

Traceback (most recent call last):
  File "./site_diff.py", line 35, in <module>
    import dpxdt
ImportError: No module named dpxdt
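
This usually means the directory containing the dpxdt/ package is not on the Python path when ./site_diff.py is invoked directly. A hedged workaround follows; where that directory sits relative to the script depends on the checkout, so the parent-directory guess below is only an example:

    import os
    import sys

    # Hedged sketch: put the directory that contains the dpxdt/ package on
    # sys.path before importing it. Adjust the relative path to your checkout.
    sys.path.insert(0, os.path.abspath(
        os.path.join(os.path.dirname(__file__), os.pardir)))

    import dpxdt

Setting PYTHONPATH to the repository root before running the script accomplishes the same thing without editing it.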

Time taken to run a test

I have submitted a sample test, and its status has been showing as processing for more than 40 minutes.

How can I debug this?

Uncaught 'TimerItem' object has no attribute 'json' exception sometimes stops Depicted

After running for some time, our Depicted installation stops after throwing the following exception:

INFO queue_worker.py:205] Fetching 1 tasks from queue_url='http://localhost:5000/api/work_queue/capture' for workflow=<class 'dpxdt.client.capture_worker.DoCaptureQueueWorkflow'>
INFO _internal.py:119] 127.0.0.1 - - [31/Oct/2013 05:58:42] "POST /api/work_queue/capture/lease HTTP/1.1" 200 -
ERROR workers.py:436] WorkflowThread:-1244660928 error item=dpxdt.client.timer_worker.TimerItem({ready_time: 1383195523.093645, delay_seconds: 2, done: True})
Traceback (most recent call last):
File "/home/ubuntu/dpxdt/dpxdt/client/workers.py", line 428, in handle_item
next_item = generator.send(item)
File "/home/ubuntu/dpxdt/dpxdt/client/queue_worker.py", line 217, in run
if next_item.json:
AttributeError: 'TimerItem' object has no attribute 'json'
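
The traceback shows queue_worker.py doing an unconditional next_item.json check even though a TimerItem can be yielded back into the workflow. A hedged guard:

    # Hedged sketch: TimerItem instances carry no `json` attribute, so read it
    # defensively instead of letting the AttributeError kill the workflow.
    def response_json_or_none(item):
        return getattr(item, 'json', None)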

Not able to run this on a ubuntu machine in amazon

  • I have an Ubuntu machine in Amazon AWS where I installed all the deps (PhantomJS and ImageMagick) and checked out the code.
  • I can request the run and even see that it's in the pending state, but no screenshots are captured.
  • I can manually capture the screenshot using the phantomjs binary, so I don't think it's a network issue.
    Command for that: phantomjs examples/rasterize.js http://www.yummly.com temp.png
  • I am new to Python, so I don't understand it well yet. I was wondering if you have seen this issue before.

Thanks,
Pritesh

Here is the log of the run_combined.sh script:

+1: INFO _internal.py:119] 10.190.221.126 - - [09/Oct/2013 18:38:01] "POST /api/work_queue/run-pdiff/lease HTTP/1.1" 200 -
INFO api.py:141] Created release: build_id=1, release_name=u'WEBSITE 10/09/2013', url=None, release_number=4
INFO _internal.py:119] 76.102.14.120 - - [09/Oct/2013 18:38:01] "POST /api/create_release HTTP/1.1" 200 -
INFO api.py:270] Created run: build_id=1, release_name=u'WEBSITE 10/09/2013', release_number=4, run_name='recipe-source-page'
INFO api.py:303] Enqueueing capture task='6:d966861415cd9c212a2e761932d56cc17c3f67b7', baseline=False
INFO api.py:303] Enqueueing capture task='6:d966861415cd9c212a2e761932d56cc17c3f67b7:baseline', baseline=True
INFO _internal.py:119] 76.102.14.120 - - [09/Oct/2013 18:38:02] "POST /api/request_run HTTP/1.1" 200 -
INFO queue_worker.py:205] Fetching 1 tasks from queue_url='http://builder1.jenkins.yummly.com:5000/api/work_queue/capture' for workflow=<class 'dpxdt.client.capture_worker.DoCaptureQueueWorkflow'>
INFO _internal.py:119] 10.190.221.126 - - [09/Oct/2013 18:38:02] "POST /api/work_queue/capture/lease HTTP/1.1" 200 -
INFO queue_worker.py:115] Starting work item from queue_url='http://builder1.jenkins.yummly.com:5000/api/work_queue/capture', task={u'last_lease': 1381343882, u'task_id': u'6:d966861415cd9c212a2e761932d56cc17c3f67b7', u'created': 1381343881, u'queue_name': u'capture', u'lease_attempts': 1, u'source': u'request_run', u'eta': 1381343942, u'content_type': u'application/json', u'payload': {u'build_id': 1, u'config_sha1sum': u'44b8f87c6c40bd07579de8a884713a56b610a378', u'url': u'http://www.yummly.com/page/the-pioneer-woman', u'release_number': 4, u'run_name': u'recipe-source-page', u'release_name': u'WEBSITE 10/09/2013', u'baseline': False}}, workflow=<class 'dpxdt.client.capture_worker.DoCaptureQueueWorkflow'>, wait_seconds=0
INFO _internal.py:119] 10.190.221.126 - - [09/Oct/2013 18:38:03] "POST /api/work_queue/capture/heartbeat HTTP/1.1" 200 -
INFO _internal.py:119] 10.190.221.126 - - [09/Oct/2013 18:38:06] "GET /api/download?sha1sum=44b8f87c6c40bd07579de8a884713a56b610a378&build_id=1 HTTP/1.1" 200 -
INFO process_worker.py:58] item=dpxdt.client.capture_worker.CaptureWorkflow({args: ('/tmp/tmpWUfngf/log.txt',), output_path: '/tmp/tmpWUfngf/capture.png', config_path: '/tmp/tmpWUfngf/config.json', kwargs: {timeout_seconds: 20}}) Running subprocess: ['/usr/bin/phantomjs', '--disk-cache=no', '--debug=yes', '--ignore-ssl-errors=yes', 'dpxdt/client/capture.js', '/tmp/tmpWUfngf/config.json', '/tmp/tmpWUfngf/capture.png']
INFO _internal.py:119] 10.190.221.126 - - [09/Oct/2013 18:38:06] "POST /api/work_queue/capture/heartbeat HTTP/1.1" 200 -
INFO queue_worker.py:205] Fetching 1 tasks from queue_url='http://builder1.jenkins.yummly.com:5000/api/work_queue/capture' for workflow=<class 'dpxdt.client.capture_worker.DoCaptureQueueWorkflow'>
INFO _internal.py:119] 10.190.221.126 - - [09/Oct/2013 18:38:09] "POST /api/work_queue/capture/lease HTTP/1.1" 200 -
INFO queue_worker.py:205] Fetching 1 tasks from queue_url='http://builder1.jenkins.yummly.com:5000/api/work_queue/run-pdiff' for workflow=<class 'dpxdt.client.pdiff_worker.DoPdiffQueueWorkflow'>
INFO queue_worker.py:115] Starting work item from queue_url='http://builder1.jenkins.yummly.com:5000/api/work_queue/capture', task={u'last_lease': 1381343889, u'task_id': u'6:d966861415cd9c212a2e761932d56cc17c3f67b7:baseline', u'created': 1381343881, u'queue_name': u'capture', u'lease_attempts': 1, u'source': u'request_run', u'eta': 1381343949, u'content_type': u'application/json', u'payload': {u'build_id': 1, u'config_sha1sum': u'44b8f87c6c40bd07579de8a884713a56b610a378', u'url': u'http://www.yummly.com/page/the-pioneer-woman', u'release_number': 4, u'run_name': u'recipe-source-page', u'release_name': u'WEBSITE 10/09/2013', u'baseline': True}}, workflow=<class 'dpxdt.client.capture_worker.DoCaptureQueueWorkflow'>, wait_seconds=0
INFO _internal.py:119] 10.190.221.126 - - [09/Oct/2013 18:38:10] "POST /api/work_queue/run-pdiff/lease HTTP/1.1" 200 -
INFO _internal.py:119] 10.190.221.126 - - [09/Oct/2013 18:38:11] "POST /api/work_queue/capture/heartbeat HTTP/1.1" 200 -
INFO queue_worker.py:205] Fetching 1 tasks from queue_url='http://builder1.jenkins.yummly.com:5000/api/work_queue/run-pdiff' for workflow=<class 'dpxdt.client.pdiff_worker.DoPdiffQueueWorkflow'>
INFO _internal.py:119] 10.190.221.126 - - [09/Oct/2013 18:38:12] "POST /api/work_queue/run-pdiff/lease HTTP/1.1" 200 -
INFO _internal.py:119] 10.190.221.126 - - [09/Oct/2013 18:38:14] "GET /api/download?sha1sum=44b8f87c6c40bd07579de8a884713a56b610a378&build_id=1 HTTP/1.1" 200 -
INFO process_worker.py:58] item=dpxdt.client.capture_worker.CaptureWorkflow({args: ('/tmp/tmpgY7Umn/log.txt',), output_path: '/tmp/tmpgY7Umn/capture.png', config_path: '/tmp/tmpgY7Umn/config.json', kwargs: {timeout_seconds: 20}}) Running subprocess: ['/usr/bin/phantomjs', '--disk-cache=no', '--debug=yes', '--ignore-ssl-errors=yes', 'dpxdt/client/capture.js', '/tmp/tmpgY7Umn/config.json', '/tmp/tmpgY7Umn/capture.png']
INFO _internal.py:119] 10.190.221.126 - - [09/Oct/2013 18:38:14] "POST /api/work_queue/capture/heartbeat HTTP/1.1" 200 -
INFO queue_worker.py:205] Fetching 1 tasks from queue_url='http://builder1.jenkins.yummly.com:5000/api/work_queue/run-pdiff' for workflow=<class 'dpxdt.client.pdiff_worker.DoPdiffQueueWorkflow'>
INFO _internal.py:119] 10.190.221.126 - - [09/Oct/2013 18:38:21] "POST /api/work_queue/run-pdiff/lease HTTP/1.1" 200 -
INFO queue_worker.py:205] Fetching 1 tasks from queue_url='http://builder1.jenkins.yummly.com:5000/api/work_queue/run-pdiff' for workflow=<class 'dpxdt.client.pdiff_worker.DoPdiffQueueWorkflow'>
INFO _internal.py:119] 10.190.221.126 - - [09/Oct/2013 18:38:23] "POST /api/work_queue/run-pdiff/lease HTTP/1.1" 200 -
ERROR workers.py:436] WorkflowThread:140359827281664 error item=dpxdt.client.timer_worker.TimerItem({ready_time: 1381343906.889217, delay_seconds: 1.0, done: True})
Traceback (most recent call last):
File "/home/ubuntu/pageCompare/dpxdt/client/workers.py", line 428, in handle_item
next_item = generator.send(item)
File "/home/ubuntu/pageCompare/dpxdt/client/process_worker.py", line 79, in run
(self, process.pid, run_time))
TimeoutError: Sent SIGKILL to item=dpxdt.client.capture_worker.CaptureWorkflow({args: ('/tmp/tmpWUfngf/log.txt',), output_path: '/tmp/tmpWUfngf/capture.png', config_path: '/tmp/tmpWUfngf/config.json', kwargs: {timeout_seconds: 20}}), pid=20508, run_time=20.5722310543
INFO _internal.py:119] 10.190.221.126 - - [09/Oct/2013 18:38:27] "POST /api/work_queue/capture/heartbeat HTTP/1.1" 200 -
INFO api.py:560] Upload received: artifact_id='c5695a75755a1840c14115a3314fa6e25b904f41', content_type='text/plain'
INFO _internal.py:119] 10.190.221.126 - - [09/Oct/2013 18:38:28] "POST /api/upload HTTP/1.1" 200 -
INFO api.py:421] Saved run data: build_id=1, release_name=u'WEBSITE 10/09/2013', release_number=4, run_name=u'recipe-source-page', url=u'http://www.yummly.com/page/the-pioneer-woman', image=None, log='c5695a75755a1840c14115a3314fa6e25b904f41', config=u'44b8f87c6c40bd07579de8a884713a56b610a378'
INFO api.py:156] Release not in processing state yet: build_id=1, name=u'WEBSITE 10/09/2013', number=4
INFO api.py:499] Updated run: build_id=1, release_name=u'WEBSITE 10/09/2013', release_number=4, run_name=u'recipe-source-page', status=u'data_pending'

Work queue deleting a non-existent task raises exception

Exception on /api/work_queue/run-pdiff/lease [POST]
Traceback (most recent call last):
File "./lib/flask/app.py", line 1809, in wsgi_app
response = self.full_dispatch_request()
File "./lib/flask/app.py", line 1482, in full_dispatch_request
rv = self.handle_user_exception(e)
File "./lib/flask/app.py", line 1480, in full_dispatch_request
rv = self.dispatch_request()
File "./lib/flask/app.py", line 1466, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/base/data/home/apps/s~dpxdt-test/06-18-r02.368193824512065913/dpxdt/server/auth.py", line 307, in wrapped
return f(*args, **kwargs)
File "/base/data/home/apps/s~dpxdt-test/06-18-r02.368193824512065913/dpxdt/server/work_queue.py", line 333, in handle_lease
db.session.commit()
File "./lib/sqlalchemy/orm/scoping.py", line 149, in do
return getattr(self.registry(), name)(*args, **kwargs)
File "./lib/sqlalchemy/orm/session.py", line 721, in commit
self.transaction.commit()
File "./lib/sqlalchemy/orm/session.py", line 354, in commit
self._prepare_impl()
File "./lib/sqlalchemy/orm/session.py", line 334, in _prepare_impl
self.session.flush()
File "./lib/sqlalchemy/orm/session.py", line 1818, in flush
self._flush(objects)
File "./lib/sqlalchemy/orm/session.py", line 1936, in _flush
transaction.rollback(_capture_exception=True)
File "./lib/sqlalchemy/util/langhelpers.py", line 56, in exit
compat.reraise(exc_type, exc_value, exc_tb)
File "./lib/sqlalchemy/orm/session.py", line 1900, in _flush
flush_context.execute()
File "./lib/sqlalchemy/orm/unitofwork.py", line 372, in execute
rec.execute(self)
File "./lib/sqlalchemy/orm/unitofwork.py", line 525, in execute
uow
File "./lib/sqlalchemy/orm/persistence.py", line 59, in save_obj
mapper, table, update)
File "./lib/sqlalchemy/orm/persistence.py", line 511, in _emit_update_statements
(table.description, len(update), rows))
StaleDataError: UPDATE statement on table 'work_queue' expected to update 1 row(s); 0 were matched.
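
A hedged sketch of how the lease handler could tolerate this case: a zero-row UPDATE on work_queue means the task was deleted out from under us, which can be reported cleanly instead of becoming a 500. session stands in for dpxdt's db.session:

    from sqlalchemy.orm.exc import StaleDataError

    # Hedged sketch: swallow the "expected to update 1 row(s); 0 were matched"
    # case and report the missing task instead of failing the request.
    def commit_or_note_missing(session):
        try:
            session.commit()
            return None
        except StaleDataError:
            session.rollback()
            return 'Task no longer exists; it was deleted by another worker.'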

UnicodeDecodeError: 'ascii' codec can't decode byte 0xe5

LiuRan-GhostCoder:dpxdt twer$ ./run_site_diff.sh \
    --upload_build_id=1 \
    --crawl_depth=1 \
    http://www.bing.com

Scanning for content
Scanning 1 pages for good urls

Found 15 new URLs from http://www.bing.com/
Finished crawl at depth 0
Scanning 15 pages for good urls
Found 7 new URLs from http://www.bing.com/t.indexOf(
Found 9 new URLs from http://www.bing.com/http:\/\/cn.bing.com\/
Found 13 new URLs from http://www.bing.com/yxl
Found 9 new URLs from http://www.bing.com/images
Found 9 new URLs from http://www.bing.com/videos
Found 12 new URLs from http://www.bing.com/explore
Found 0 new URLs from http://www.bing.com/search
Found 12 new URLs from http://www.bing.com/news
Found 21 new URLs from http://www.bing.com/ditu/
ERROR:root:WorkflowThread:4340813824 error item=main.PrintWorkflow({args: ('Found 21 new URLs from http://www.bing.com/ditu/',), done: True, kwargs: {}})
Traceback (most recent call last):
File "/Users/twer/work/perceptual/dpxdt/dpxdt/client/workers.py", line 426, in handle_item
next_item = generator.send(item.result)
File "./dpxdt/tools/site_diff.py", line 231, in run
found = extract_urls(item.url, item.data)
File "./dpxdt/tools/site_diff.py", line 130, in extract_urls
data = re.sub(pattern, fixed, data)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/re.py", line 151, in sub
return _compile(pattern, flags).sub(repl, string, count)
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe5 in position 11594: ordinal not in range(128)
Traceback (most recent call last):
File "./dpxdt/tools/site_diff.py", line 338, in
main(sys.argv)
File "./dpxdt/tools/site_diff.py", line 334, in main
upload_release_name=FLAGS.upload_release_name)
File "./dpxdt/tools/site_diff.py", line 310, in real_main
coordinator.wait_one()
File "/Users/twer/work/perceptual/dpxdt/dpxdt/client/workers.py", line 342, in wait_one
item.check_result()
File "/Users/twer/work/perceptual/dpxdt/dpxdt/client/workers.py", line 426, in handle_item
next_item = generator.send(item.result)
File "./dpxdt/tools/site_diff.py", line 231, in run
found = extract_urls(item.url, item.data)
File "./dpxdt/tools/site_diff.py", line 130, in extract_urls
data = re.sub(pattern, fixed, data)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/re.py", line 151, in sub
return _compile(pattern, flags).sub(repl, string, count)
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe5 in position 11594: ordinal not in range(128)
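
The failure is an implicit ASCII decode inside re.sub when extract_urls() receives raw page bytes containing 0xe5. A hedged fix is to decode the data before the regex work (Python 2, which dpxdt targets):

    # Hedged sketch for extract_urls() in site_diff.py: decode fetched bytes
    # up front so non-ASCII content cannot trigger the implicit ascii decode.
    # 'replace' keeps the crawl going even when the page's charset is unknown.
    def to_text(data, encoding='utf-8'):
        if isinstance(data, unicode):
            return data
        return data.decode(encoding, 'replace')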

pypi package? egg?

Hey Brett -- I would love to make using Depicted easier for my developers. Are you planning on packaging this up? If you're not, I would like to volunteer to do that and if needed be the one who maintains the package. Let me know

phantom.js hangs on pages without javascript

Hello,
I've noticed that any capture will hang and therefore fail if there is no JavaScript on a page. Here is an example:

cat /tmp/phantom_fail.js
{"targetUrl": "http://www.apache.org/server-status"}
/usr/local/bin/phantomjs --disk-cache=false --debug=true dpxdt/client/capture.js /tmp/phantom_fail.js /tmp/capture.png
2013-06-28T16:50:16 [DEBUG] CookieJar - Created but will not store cookies (use option '--cookies-file=<filename>' to enable persisten cookie storage)
2013-06-28T16:50:16 [DEBUG] Phantom - execute: Configuration
2013-06-28T16:50:16 [DEBUG]      0 objectName : ""
2013-06-28T16:50:16 [DEBUG]      1 cookiesFile : ""
2013-06-28T16:50:16 [DEBUG]      2 diskCacheEnabled : "false"
2013-06-28T16:50:16 [DEBUG]      3 maxDiskCacheSize : "-1"
2013-06-28T16:50:16 [DEBUG]      4 ignoreSslErrors : "false"
2013-06-28T16:50:16 [DEBUG]      5 localToRemoteUrlAccessEnabled : "false"
2013-06-28T16:50:16 [DEBUG]      6 outputEncoding : "UTF-8"
2013-06-28T16:50:16 [DEBUG]      7 proxyType : "http"
2013-06-28T16:50:16 [DEBUG]      8 proxy : ":8"
2013-06-28T16:50:16 [DEBUG]      9 proxyAuth : ":"
2013-06-28T16:50:16 [DEBUG]      10 scriptEncoding : "UTF-8"
2013-06-28T16:50:16 [DEBUG]      11 webSecurityEnabled : "true"
2013-06-28T16:50:16 [DEBUG]      12 offlineStoragePath : ""
2013-06-28T16:50:16 [DEBUG]      13 offlineStorageDefaultQuota : "-1"
2013-06-28T16:50:16 [DEBUG]      14 printDebugMessages : "true"
2013-06-28T16:50:16 [DEBUG]      15 javascriptCanOpenWindows : "true"
2013-06-28T16:50:16 [DEBUG]      16 javascriptCanCloseWindows : "true"
2013-06-28T16:50:16 [DEBUG]      17 sslProtocol : "sslv3"
2013-06-28T16:50:16 [DEBUG]      18 sslCertificatesPath : ""
2013-06-28T16:50:16 [DEBUG]      19 webdriver : ":"
2013-06-28T16:50:16 [DEBUG]      20 webdriverLogFile : ""
2013-06-28T16:50:16 [DEBUG]      21 webdriverLogLevel : "INFO"
2013-06-28T16:50:16 [DEBUG]      22 webdriverSeleniumGridHub : ""
2013-06-28T16:50:16 [DEBUG] Phantom - execute: Script & Arguments
2013-06-28T16:50:16 [DEBUG]      script: "dpxdt/client/capture.js"
2013-06-28T16:50:16 [DEBUG]      0 arg: "/tmp/phantom_fail.js"
2013-06-28T16:50:16 [DEBUG]      1 arg: "/tmp/capture.png"
2013-06-28T16:50:16 [DEBUG] Phantom - execute: Starting normal mode
2013-06-28T16:50:16 [DEBUG] WebPage - setupFrame ""
2013-06-28T16:50:16 [DEBUG] FileSystem - _open: ":/modules/fs.js" QMap(("mode", QVariant(QString, "r") ) )
2013-06-28T16:50:16 [DEBUG] FileSystem - _open: ":/modules/system.js" QMap(("mode", QVariant(QString, "r") ) )
2013-06-28T16:50:16 [DEBUG] FileSystem - _open: ":/modules/_coffee-script.js" QMap(("mode", QVariant(QString, "r") ) )
2013-06-28T16:50:16 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/package.json" QMap(("mode", QVariant(QString, "r") ) )
2013-06-28T16:50:16 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/coffee-script.js" QMap(("mode", QVariant(QString, "r") ) )
2013-06-28T16:50:16 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/./lexer.js" QMap(("mode", QVariant(QString, "r") ) )
2013-06-28T16:50:16 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/././rewriter.js" QMap(("mode", QVariant(QString, "r") ) )
2013-06-28T16:50:16 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/././helpers.js" QMap(("mode", QVariant(QString, "r") ) )
2013-06-28T16:50:16 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/./parser.js" QMap(("mode", QVariant(QString, "r") ) )
2013-06-28T16:50:16 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/./helpers.js" QMap(("mode", QVariant(QString, "r") ) )
2013-06-28T16:50:16 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/./nodes.js" QMap(("mode", QVariant(QString, "r") ) )
2013-06-28T16:50:16 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/././scope.js" QMap(("mode", QVariant(QString, "r") ) )
2013-06-28T16:50:16 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/./././helpers.js" QMap(("mode", QVariant(QString, "r") ) )
2013-06-28T16:50:16 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/././lexer.js" QMap(("mode", QVariant(QString, "r") ) )
2013-06-28T16:50:16 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/./././rewriter.js" QMap(("mode", QVariant(QString, "r") ) )
2013-06-28T16:50:16 [DEBUG] FileSystem - _open: ":/modules/webpage.js" QMap(("mode", QVariant(QString, "r") ) )
2013-06-28T16:50:16 [DEBUG] FileSystem - _open: "/tmp/phantom_fail.js" QMap(("mode", QVariant(QString, "r") ) )
2013-06-28T16:50:16 [DEBUG] WebpageCallbacks - getGenericCallback
2013-06-28T16:50:16 [DEBUG] WebPage - updateLoadingProgress: 10
2013-06-28T16:50:17 [DEBUG] WebPage - updateLoadingProgress: 30
Loaded: http://www.apache.org/server-status
2013-06-28T16:50:18 [DEBUG] WebPage - updateLoadingProgress: 100
Finished loading page: http://www.apache.org/server-status
2013-06-28T16:50:18 [DEBUG] WebPage - setupFrame ""
2013-06-28T16:50:18 [DEBUG] WebPage - setupFrame ""
2013-06-28T16:50:18 [DEBUG] WebPage - evaluateJavaScript "(function() { return (function () {
        document.addEventListener('DOMContentLoaded', function() {
            window.callPhantom('DOMContentLoaded');
        }, false);
    })(); })()"
2013-06-28T16:50:18 [DEBUG] WebPage - evaluateJavaScript result QVariant(, )

Is this expected behaviour?

Thank you

Make the artifact view full screen on small displays

When the screen height is smaller than the image height, collapse the header bar and make the whole UI full screen. Could also make this an option if people click a button or something. Mousing over the top would roll out the header like a drawer.

no module named flask

Trying to run the simple ./run_combined.sh script (which worked on Friday), I now get the following error

$ ./run_combined.sh
Traceback (most recent call last):
File "./dpxdt/runserver.py", line 29, in
from dpxdt import server
File "/Users/dborin/git/depicted/dpxdt/server/init.py", line 23, in
from flask import Flask, url_for
ImportError: No module named flask

Nothing in the README indicates I need to install Flask. Is this now a requirement?

Unable to take screenshots of pages requiring SSL

I've got dpxdt running locally and I'm loving the possibilities, but I need it to be able to access sites that use self-signed SSL certificates, such as staging or a local dev environment.

Comparing google.com and yahoo.com works as expected:

./run_url_pair_diff.sh --upload_build_id=1 http://google.com http://yahoo.com

I want to compare between staging, production, and local dev with a command like:

./run_url_pair_diff.sh --upload_build_id=2 https://broadway.squareup.com/market/sightglass-coffee https://squareup.com/market/sightglass-coffee

I have manually created certs for the staging and local dev environments; how can I give dpxdt access to these?

Here's the PhantomJS Log showing the failed screenshot when attempting to access the staging server:

2013-07-23T15:38:43 [DEBUG] CookieJar - Created but will not store cookies (use option '--cookies-file=<filename>' to enable persisten cookie storage) 
2013-07-23T15:38:43 [DEBUG] Phantom - execute: Configuration 
2013-07-23T15:38:43 [DEBUG]      0 objectName : "" 
2013-07-23T15:38:43 [DEBUG]      1 cookiesFile : "" 
2013-07-23T15:38:43 [DEBUG]      2 diskCacheEnabled : "false" 
2013-07-23T15:38:43 [DEBUG]      3 maxDiskCacheSize : "-1" 
2013-07-23T15:38:43 [DEBUG]      4 ignoreSslErrors : "false" 
2013-07-23T15:38:43 [DEBUG]      5 localToRemoteUrlAccessEnabled : "false" 
2013-07-23T15:38:43 [DEBUG]      6 outputEncoding : "UTF-8" 
2013-07-23T15:38:43 [DEBUG]      7 proxyType : "http" 
2013-07-23T15:38:43 [DEBUG]      8 proxy : ":1080" 
2013-07-23T15:38:43 [DEBUG]      9 proxyAuth : ":" 
2013-07-23T15:38:43 [DEBUG]      10 scriptEncoding : "UTF-8" 
2013-07-23T15:38:43 [DEBUG]      11 webSecurityEnabled : "true" 
2013-07-23T15:38:43 [DEBUG]      12 offlineStoragePath : "" 
2013-07-23T15:38:43 [DEBUG]      13 offlineStorageDefaultQuota : "-1" 
2013-07-23T15:38:43 [DEBUG]      14 printDebugMessages : "true" 
2013-07-23T15:38:43 [DEBUG]      15 javascriptCanOpenWindows : "true" 
2013-07-23T15:38:43 [DEBUG]      16 javascriptCanCloseWindows : "true" 
2013-07-23T15:38:43 [DEBUG]      17 sslProtocol : "sslv3" 
2013-07-23T15:38:43 [DEBUG]      18 sslCertificatesPath : "" 
2013-07-23T15:38:43 [DEBUG]      19 webdriver : ":" 
2013-07-23T15:38:43 [DEBUG]      20 webdriverLogFile : "" 
2013-07-23T15:38:43 [DEBUG]      21 webdriverLogLevel : "INFO" 
2013-07-23T15:38:43 [DEBUG]      22 webdriverSeleniumGridHub : "" 
2013-07-23T15:38:43 [DEBUG] Phantom - execute: Script & Arguments 
2013-07-23T15:38:43 [DEBUG]      script: "dpxdt/client/capture.js" 
2013-07-23T15:38:43 [DEBUG]      0 arg: "/var/folders/5x/yw20597s0s3bql4dj98c8fzc0000zt/T/tmpiZAres/config.json" 
2013-07-23T15:38:43 [DEBUG]      1 arg: "/var/folders/5x/yw20597s0s3bql4dj98c8fzc0000zt/T/tmpiZAres/capture.png" 
2013-07-23T15:38:43 [DEBUG] Phantom - execute: Starting normal mode 
2013-07-23T15:38:43 [DEBUG] WebPage - setupFrame "" 
2013-07-23T15:38:43 [DEBUG] FileSystem - _open: ":/modules/fs.js" QMap(("mode", QVariant(QString, "r") ) )  
2013-07-23T15:38:43 [DEBUG] FileSystem - _open: ":/modules/system.js" QMap(("mode", QVariant(QString, "r") ) )  
2013-07-23T15:38:43 [DEBUG] FileSystem - _open: ":/modules/_coffee-script.js" QMap(("mode", QVariant(QString, "r") ) )  
2013-07-23T15:38:43 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/package.json" QMap(("mode", QVariant(QString, "r") ) )  
2013-07-23T15:38:43 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/coffee-script.js" QMap(("mode", QVariant(QString, "r") ) )  
2013-07-23T15:38:43 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/./lexer.js" QMap(("mode", QVariant(QString, "r") ) )  
2013-07-23T15:38:43 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/././rewriter.js" QMap(("mode", QVariant(QString, "r") ) )  
2013-07-23T15:38:43 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/././helpers.js" QMap(("mode", QVariant(QString, "r") ) )  
2013-07-23T15:38:43 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/./parser.js" QMap(("mode", QVariant(QString, "r") ) )  
2013-07-23T15:38:43 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/./helpers.js" QMap(("mode", QVariant(QString, "r") ) )  
2013-07-23T15:38:43 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/./nodes.js" QMap(("mode", QVariant(QString, "r") ) )  
2013-07-23T15:38:43 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/././scope.js" QMap(("mode", QVariant(QString, "r") ) )  
2013-07-23T15:38:43 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/./././helpers.js" QMap(("mode", QVariant(QString, "r") ) )  
2013-07-23T15:38:43 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/././lexer.js" QMap(("mode", QVariant(QString, "r") ) )  
2013-07-23T15:38:43 [DEBUG] FileSystem - _open: ":/modules/../coffee-script/./lib/coffee-script/./././rewriter.js" QMap(("mode", QVariant(QString, "r") ) )  
2013-07-23T15:38:43 [DEBUG] FileSystem - _open: ":/modules/webpage.js" QMap(("mode", QVariant(QString, "r") ) )  
2013-07-23T15:38:43 [DEBUG] FileSystem - _open: "/var/folders/5x/yw20597s0s3bql4dj98c8fzc0000zt/T/tmpiZAres/config.json" QMap(("mode", QVariant(QString, "r") ) )  
2013-07-23T15:38:43 [DEBUG] WebPage - updateLoadingProgress: 10 
2013-07-23T15:38:43 [DEBUG] Network - SSL Error: "The issuer certificate of a locally looked up certificate could not be found" 
2013-07-23T15:38:43 [DEBUG] Network - SSL Error: "The root CA certificate is not trusted for this purpose" 
2013-07-23T15:38:43 [DEBUG] Network - Resource request error: 6 ( "SSL handshake failed" ) URL: "https://broadway.squareup.com/market/sightglass-coffee" 
Loaded: https://broadway.squareup.com/market/sightglass-coffee
2013-07-23T15:38:43 [DEBUG] WebPage - updateLoadingProgress: 100 
page.onLoadFinished
Finished loading page: https://broadway.squareup.com/market/sightglass-coffee w/ status: fail
page.doDepictedScreenshots /var/folders/5x/yw20597s0s3bql4dj98c8fzc0000zt/T/tmpiZAres/capture.png
2013-07-23T15:38:43 [DEBUG] WebPage - setupFrame "" 
2013-07-23T15:38:43 [DEBUG] WebPage - setupFrame "" 
page.onInitialized
Taking the screenshot!
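
The configuration dump above shows ignoreSslErrors: "false" and an empty sslCertificatesPath for this capture, which is why the handshake to the self-signed staging host fails. A hedged sketch of a capture invocation that tolerates such certificates; the flag names are standard PhantomJS options, while the binary and file paths are illustrative:

    # Hedged sketch of a phantomjs invocation for self-signed environments.
    # --ignore-ssl-errors / --ssl-certificates-path surface as ignoreSslErrors
    # and sslCertificatesPath in the configuration dump above.
    capture_command = [
        'phantomjs',
        '--disk-cache=false',
        '--ignore-ssl-errors=true',
        '--ssl-certificates-path=/path/to/your/ca-certs',  # hypothetical CA dir
        'dpxdt/client/capture.js',
        '/tmp/config.json',    # illustrative config path
        '/tmp/capture.png',    # illustrative output path
    ]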
