
jupyterhub / jupyter-server-proxy


Jupyter notebook server extension to proxy web services.

Home Page: https://jupyter-server-proxy.readthedocs.io

License: BSD 3-Clause "New" or "Revised" License

Python 83.45% JavaScript 1.40% TypeScript 10.72% RobotFramework 4.34% HTML 0.09%
binder jupyter jupyter-notebook jupyterhub proxy

jupyter-server-proxy's Introduction

Technical Overview | Installation | Configuration | Docker | Contributing | License | Help and Resources



With JupyterHub you can create a multi-user Hub that spawns, manages, and proxies multiple instances of the single-user Jupyter notebook server.

Project Jupyter created JupyterHub to support many users. The Hub can offer notebook servers to a class of students, a corporate data science workgroup, a scientific research project, or a high-performance computing group.

Technical overview

Three main actors make up JupyterHub:

  • multi-user Hub (tornado process)
  • configurable http proxy (node-http-proxy)
  • multiple single-user Jupyter notebook servers (Python/Jupyter/tornado)

Basic principles for operation are:

  • Hub launches a proxy.
  • The Proxy forwards all requests to Hub by default.
  • Hub handles login and spawns single-user servers on demand.
  • Hub configures proxy to forward URL prefixes to the single-user notebook servers.

JupyterHub also provides a REST API for administration of the Hub and its users.
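
For example, listing the Hub's users via the REST API can be sketched with just the standard library. The Hub URL and token below are placeholders: generate a real token with `jupyterhub token <username>` or from the Hub admin page.

```python
# Sketch: query the JupyterHub REST API to list users.
# `hub_api` and `token` are placeholder values.
from urllib.request import Request

hub_api = "http://localhost:8000/hub/api"
token = "REPLACE_WITH_YOUR_TOKEN"

req = Request(
    f"{hub_api}/users",
    headers={"Authorization": f"token {token}"},
)
# urllib.request.urlopen(req) would return a JSON list of user records
# on a running Hub; here we just show the constructed request.
print(req.full_url)
print(req.get_header("Authorization"))
```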

Installation

Check prerequisites

  • A Linux/Unix based system

  • Python 3.8 or greater

  • nodejs/npm

    • If you are using conda, the nodejs and npm dependencies will be installed for you by conda.

    • If you are using pip, install a recent version (at least 12.0) of nodejs/npm.

  • If using the default PAM Authenticator, a pluggable authentication module (PAM).

  • TLS certificate and key for HTTPS communication

  • Domain name

Install packages

Using conda

To install JupyterHub along with its dependencies including nodejs/npm:

conda install -c conda-forge jupyterhub

If you plan to run notebook servers locally, install JupyterLab or Jupyter notebook:

conda install jupyterlab
conda install notebook

Using pip

JupyterHub can be installed with pip, and the proxy with npm:

npm install -g configurable-http-proxy
python3 -m pip install jupyterhub

If you plan to run notebook servers locally, you will need to install JupyterLab or Jupyter notebook:

python3 -m pip install --upgrade jupyterlab
python3 -m pip install --upgrade notebook

Run the Hub server

To start the Hub server, run the command:

jupyterhub

Visit http://localhost:8000 in your browser, and sign in with your system username and password.

Note: To allow multiple users to sign in to the server, you will need to run the jupyterhub command as a privileged user, such as root. The wiki describes how to run the server as a less privileged user, which requires more configuration of the system.

Configuration

The Getting Started section of the documentation explains the common steps in setting up JupyterHub.

The JupyterHub tutorial provides an in-depth video and sample configurations of JupyterHub.

Create a configuration file

To generate a default config file with settings and descriptions:

jupyterhub --generate-config

Start the Hub

To start the Hub on a specific IP address and port, e.g. 10.0.1.2:443, with HTTPS:

jupyterhub --ip 10.0.1.2 --port 443 --ssl-key my_ssl.key --ssl-cert my_ssl.cert
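
The same options can be set in the generated config file instead of on the command line; a sketch (traitlet names as in the JupyterHub configuration reference, paths are the example values from above):

```python
# jupyterhub_config.py -- file-based equivalent of the flags above
c.JupyterHub.ip = '10.0.1.2'
c.JupyterHub.port = 443
c.JupyterHub.ssl_key = 'my_ssl.key'
c.JupyterHub.ssl_cert = 'my_ssl.cert'
```

Then start the Hub with `jupyterhub -f /path/to/jupyterhub_config.py`.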

Authenticators

Authenticator Description
PAMAuthenticator Default, built-in authenticator
OAuthenticator OAuth + JupyterHub Authenticator = OAuthenticator
ldapauthenticator Simple LDAP Authenticator Plugin for JupyterHub
kerberosauthenticator Kerberos Authenticator Plugin for JupyterHub
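
Switching authenticators is a config-file change; a sketch using OAuthenticator's GitHub support (assumes `pip install oauthenticator`; the callback URL, client id, and secret are placeholders):

```python
# jupyterhub_config.py -- replace the default PAMAuthenticator with GitHub OAuth
c.JupyterHub.authenticator_class = 'oauthenticator.GitHubOAuthenticator'
c.GitHubOAuthenticator.oauth_callback_url = 'https://example.com/hub/oauth_callback'
c.GitHubOAuthenticator.client_id = 'REPLACE_ME'
c.GitHubOAuthenticator.client_secret = 'REPLACE_ME'
```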

Spawners

Spawner Description
LocalProcessSpawner Default, built-in spawner starts single-user servers as local processes
dockerspawner Spawn single-user servers in Docker containers
kubespawner Kubernetes spawner for JupyterHub
sudospawner Spawn single-user servers without being root
systemdspawner Spawn single-user notebook servers using systemd
batchspawner Designed for clusters using batch scheduling software
yarnspawner Spawn single-user notebook servers distributed on a Hadoop cluster
wrapspawner WrapSpawner and ProfilesSpawner enabling runtime configuration of spawners
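
Spawners are selected the same way; a sketch using DockerSpawner (assumes `pip install dockerspawner`; the image name is an example):

```python
# jupyterhub_config.py -- spawn each single-user server in a Docker container
c.JupyterHub.spawner_class = 'dockerspawner.DockerSpawner'
c.DockerSpawner.image = 'quay.io/jupyterhub/singleuser:latest'
```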

Docker

A starter docker image for JupyterHub gives a baseline deployment of JupyterHub using Docker.

Important: This quay.io/jupyterhub/jupyterhub image contains only the Hub itself, with no configuration. In general, one needs to make a derivative image, with at least a jupyterhub_config.py setting up an Authenticator and/or a Spawner. To run the single-user servers, which may be on the same system as the Hub or not, Jupyter Notebook version 4 or greater must be installed.

The JupyterHub docker image can be started with the following command:

docker run -p 8000:8000 -d --name jupyterhub quay.io/jupyterhub/jupyterhub jupyterhub

This command will create a container named jupyterhub that you can stop and resume with docker stop/start.

The Hub service will be listening on all interfaces at port 8000, which makes this a good choice for testing JupyterHub on your desktop or laptop.

If you want to run docker on a computer that has a public IP, then you must secure it with SSL, either by adding SSL options to your docker configuration or by using an SSL-enabled proxy.

Mounting volumes will allow you to store data outside the docker image (on the host system) so it persists even when you start a new container.

The command docker exec -it jupyterhub bash will spawn a root shell in your docker container. You can use the root shell to create system users in the container. These accounts will be used for authentication in JupyterHub's default configuration.

Contributing

If you would like to contribute to the project, please read our contributor documentation and the CONTRIBUTING.md. The CONTRIBUTING.md file explains how to set up a development installation, how to run the test suite, and how to contribute to documentation.

For a high-level view of the vision and next directions of the project, see the JupyterHub community roadmap.

A note about platform support

JupyterHub is supported on Linux/Unix based systems.

JupyterHub officially does not support Windows. You may be able to use JupyterHub on Windows if you use a Spawner and Authenticator that work on Windows, but the JupyterHub defaults will not. Bugs reported on Windows will not be accepted, and the test suite will not run on Windows. Small patches that fix minor Windows compatibility issues (such as basic installation) may be accepted, however. For Windows-based systems, we would recommend running JupyterHub in a docker container or Linux VM.

Additional Reference: Tornado's documentation on Windows platform support

License

We use a shared copyright model that enables all contributors to maintain the copyright on their contributions.

All code is licensed under the terms of the revised BSD license.

Help and resources

We encourage you to ask questions and share ideas on the Jupyter community forum. You can also talk with us on our JupyterHub Gitter channel.

JupyterHub follows the Jupyter Community Guides.



jupyter-server-proxy's People

Contributors

athornton, betatim, bollwyvl, cmd-ntrf, consideratio, dependabot[bot], derekheldtwerle, dipanjank, gbrault, iagomez, ian-r-rose, jacobtomlinson, janjagusch, janjaguschqc, jtpio, lsetiawan, mahnerak, manics, maresb, minrk, oeway, pre-commit-ci[bot], rcthomas, rmorshea, rschroll, ryanlovett, ryshoooo, takluyver, willingc, yuvipanda


jupyter-server-proxy's Issues

Proxying relative URLs in redirects

In gitter I was chatting with @yuvipanda about the possibility of proxying one version of JupyterLab (specifically using --dev-mode) underneath the main one running in binder. https://gitter.im/jupyterhub/binder?at=5c42411983189945240fa7af

In trying this I ran into an issue with redirects. The proxied notebook server returns several redirects towards the default page, but when these redirects (e.g., to /lab), are received by the outer notebook server, they are then used to proxy to a page there rather than a page on the proxied server. This is not the effect I was looking for, and it seems to make pages in the proxied server entirely inaccessible if there is a redirect in the way.

This can be tested locally in this branch by running jupyter notebook --config=binder/config.py. Any thoughts on how to move forward?

I don't know a ton about proxy terminology, so apologies if I'm using the wrong terms or if this is easily fixable :)

Proxying services on a named docker-compose network

If we run a Jupyter notebook server in a docker container, composed via docker-compose with another application (eg OpenRefine) running in a separate container, eg with a docker-compose script along the lines of:

version: '3.5'

services:
  refine:
    container_name: refine_container
    image: psychemedia/openrefinedemo
    # Let the jupyter server proxy expose the port
    #internal port 3333 is available on jupyternet as http://refine_container:3333
    #ports:
    #  - "3333:3333"
    networks:
      - jupyternet
    restart: unless-stopped

  jupyter:
    container_name: jupyter_notebook_container
    #image: jupyter/minimal-notebook
    build: ./
    environment:
      JUPYTER_TOKEN: 'letmein'
    ports:
      - "80:8888"
    networks:
      - jupyternet
    volumes:
      - ./jupyter_notebook_config.py:/home/jovyan/.jupyter/jupyter_notebook_config.py
    restart: unless-stopped

networks:
  jupyternet:
    driver: bridge

build a notebook server container with the server proxy package installed:

FROM jupyter/minimal-notebook

RUN pip install git+https://github.com/jupyterhub/jupyter-server-proxy

and mounting a traitlet config file along the lines of:

c.ServerProxy.servers = {
    'openrefine': {
        #I don't need a command, server is already running?
        #'command': ['/home/jovyan/.openrefine/openrefine-2.8/refine', '-p', '{port}','-d','/home/jovyan/openrefine'],
        #Can we specify network: http://refine_container
        'port': 3333,
        'timeout': 120,
        'launcher_entry': {
            'title': 'OpenRefine'
        },
    },
}

is there a way of:

  • specifying that the service we want to proxy is on http://refine_container:3333 (which is available over jupyternet);
  • avoiding the start command (because the service we want to proxy is already running freely in the connected container)?

I guess another way to do it would be just to build OpenRefine into the notebook container and then letting the traitlet handle starting OpenRefine; but it would open up more possibilities in general if we could just compose the OpenRefine container in? [UPDATE: example]

PyPI wheel indicates Python 2 support

The wheel on PyPI (nbserverproxy-0.7-py2.py3-none-any.whl ) is marked as supporting Python 2 which is not true. Installing in a Python 2.7 environment via pip succeeds but enabling the server extension fails with a SyntaxError:

$ python -V
Python 2.7.14 :: Anaconda, Inc.
$ jupyter serverextension enable --py nbserverproxy
Traceback (most recent call last):
  File "/home/jhelmus/anaconda3/envs/ttt/bin/jupyter-serverextension", line 11, in <module>
    sys.exit(main())
  File "/home/jhelmus/anaconda3/envs/ttt/lib/python2.7/site-packages/jupyter_core/application.py", line 266, in launch_instance
    return super(JupyterApp, cls).launch_instance(argv=argv, **kwargs)
  File "/home/jhelmus/anaconda3/envs/ttt/lib/python2.7/site-packages/traitlets/config/application.py", line 658, in launch_instance
    app.start()
  File "/home/jhelmus/anaconda3/envs/ttt/lib/python2.7/site-packages/notebook/serverextensions.py", line 293, in start
    super(ServerExtensionApp, self).start()
  File "/home/jhelmus/anaconda3/envs/ttt/lib/python2.7/site-packages/jupyter_core/application.py", line 255, in start
    self.subapp.start()
  File "/home/jhelmus/anaconda3/envs/ttt/lib/python2.7/site-packages/notebook/serverextensions.py", line 210, in start
    self.toggle_server_extension_python(arg)
  File "/home/jhelmus/anaconda3/envs/ttt/lib/python2.7/site-packages/notebook/serverextensions.py", line 199, in toggle_server_extension_python
    m, server_exts = _get_server_extension_metadata(package)
  File "/home/jhelmus/anaconda3/envs/ttt/lib/python2.7/site-packages/notebook/serverextensions.py", line 327, in _get_server_extension_metadata
    m = import_item(module)
  File "/home/jhelmus/anaconda3/envs/ttt/lib/python2.7/site-packages/traitlets/utils/importstring.py", line 42, in import_item
    return __import__(parts[0])
  File "/home/jhelmus/anaconda3/envs/ttt/lib/python2.7/site-packages/nbserverproxy/__init__.py", line 1, in <module>
    from nbserverproxy.handlers import setup_handlers
  File "/home/jhelmus/anaconda3/envs/ttt/lib/python2.7/site-packages/nbserverproxy/handlers.py", line 53
    async def get(self, *args, **kwargs):
            ^
SyntaxError: invalid syntax

It would be nice if the incompatibility with Python 2 was expressed in a more obvious manner.

Request for extension: VNC Proxy

Now that nbserverproxy has websocket support, it'd be great to write an extension that utilizes https://github.com/novnc/noVNC (VNC in browser via websockets) to provide VNC inside JupyterHub. It should just require access to an X socket, which in JupyterHub deployments can be provided via Docker.

NoneType can't be used in 'await' expression - May be harmless

Ran into the following traceback when using dask-labextension with jupyter-server-proxy. This occurred with Tornado 6, but went away with Tornado 5. Am using JupyterLab 0.35.4, dask-labextension 0.3.1, Bokeh 1.0.4, and jupyter-server-proxy 1.0.0.

LabApp - ERROR - Uncaught exception GET /proxy/9100/status (10.110.73.60)
HTTPServerRequest(protocol='http', host='10.31.241.45:9999', method='GET', uri='/proxy/9100/status', version='HTTP/1.1', remote_ip='10.110.73.60')
Traceback (most recent call last):
  File "/home/nfs/jkirkham/miniconda/envs/rapidsdev37/lib/python3.7/site-packages/tornado/web.py", line 1699, in _execute
    result = await result
  File "/home/nfs/jkirkham/miniconda/envs/rapidsdev37/lib/python3.7/site-packages/jupyter_server_proxy/websocket.py", line 87, in get
    return await self.http_get(*args, **kwargs)
  File "/home/nfs/jkirkham/miniconda/envs/rapidsdev37/lib/python3.7/site-packages/jupyter_server_proxy/handlers.py", line 247, in http_get
    return await self.proxy(port, proxy_path)
TypeError: object NoneType can't be used in 'await' expression
LabApp - ERROR - Uncaught exception GET /proxy/9100/status (10.110.73.60)
HTTPServerRequest(protocol='http', host='10.31.241.45:9999', method='GET', uri='/proxy/9100/status', version='HTTP/1.1', remote_ip='10.110.73.60')
Traceback (most recent call last):
  File "/home/nfs/jkirkham/miniconda/envs/rapidsdev37/lib/python3.7/site-packages/tornado/web.py", line 1699, in _execute
    result = await result
  File "/home/nfs/jkirkham/miniconda/envs/rapidsdev37/lib/python3.7/site-packages/jupyter_server_proxy/websocket.py", line 87, in get
    return await self.http_get(*args, **kwargs)
  File "/home/nfs/jkirkham/miniconda/envs/rapidsdev37/lib/python3.7/site-packages/jupyter_server_proxy/handlers.py", line 247, in http_get
    return await self.proxy(port, proxy_path)
TypeError: object NoneType can't be used in 'await' expression

Setting up issue

Hi,

I'm relatively new to proxy servers so I apologize for my ignorance on the subject and ignorance on using this package.

I came to nbserverproxy via the dask-jobqueue project. I am hoping to tunnel my notebook from the HPC to my local machine (laptop) and also tunnel the dask dashboard which is typically served at http://localhost:8787/status. I'm having trouble accessing the dashboard and nbserverproxy was proposed as a solution. Unfortunately, I couldn't get nbserverproxy to work and I was wondering if I was doing something wrong in the setup?

The original issue is posted at dask/dask-jobqueue#116 but I realized I should probably ask here for help with nbserverproxy rather than the guys in dask-jobqueue.

I have an example below where I am trying to get the README example to work

(djq_lsf) [rxb826@pegasus ~]$ conda install -c conda-forge nbserverproxy
(djq_lsf) [rxb826@pegasus ~]$ jupyter serverextension enable --py nbserverproxy
Enabling: nbserverproxy
- Writing config: /nethome/rxb826/.jupyter
    - Validating...
      nbserverproxy  OK
(djq_lsf) [rxb826@pegasus ~]$ jupyter lab --no-browser
[I 00:13:06.932 LabApp] The Jupyter Notebook is running at:
[I 00:13:06.932 LabApp] http://localhost:8888/?token=...

Open http://localhost:8888 on my laptop and try the README example
(three screenshots attached, taken 2018-08-04, showing the README example in the browser)

Spark UI not accessible

Hi everybody,

as @ryanlovett asked me I opened this issue here, related to jupyterhub/zero-to-jupyterhub-k8s#1030.
The Problem is as following:

After starting PySpark I am not able to access the Spark UI, resulting in a Jupyterhub 404 error.
Here are (hopefully) all the relevant Information:

  1. I create a new user image from the jupyter/pyspark image
  2. The Dockerfile for this image contains:
FROM jupyter/pyspark-notebook:5b2160dfd919
RUN pip install nbserverproxy
RUN jupyter serverextension enable --py nbserverproxy
USER root
RUN echo "$NB_USER ALL=(ALL) NOPASSWD:ALL" > /etc/sudoers.d/notebook
USER $NB_USER
  3. I create the SparkContext() in the pod, created with the respective image, which gives me the link to the UI.
  4. The SparkContext() is created with the following config:
conf.setMaster('k8s://https://'+ os.environ['KUBERNETES_SERVICE_HOST'] +':443')
conf.set('spark.kubernetes.container.image', 'idalab/spark-py:spark')
conf.set('spark.submit.deployMode', 'client')
conf.set('spark.executor.instances', '2')
conf.setAppName('pyspark-shell')
conf.set('spark.driver.host', '10.16.205.42 ')
os.environ['PYSPARK_PYTHON'] = 'python3'
os.environ['PYSPARK_DRIVER_PYTHON'] = 'python3'
  5. The link created by Spark is obviously not accessible on the hub as it points to <POD_IP>:4040
  6. I try to access the UI via .../username/proxy/4040 and .../username/proxy/4040/; both don't work and lead to a Jupyterhub 404.
  7. Other ports are accessible via this method so I assume nbserverproxy is working correctly.
  8. This is the output of netstat -pl:
Proto Recv-Q Send-Q Local Address           Foreign Address         State       PID/Program name
tcp        0      0 localhost:54695         0.0.0.0:*               LISTEN      23/python
tcp        0      0 localhost:33896         0.0.0.0:*               LISTEN      23/python
tcp        0      0 localhost:34577         0.0.0.0:*               LISTEN      23/python
tcp        0      0 localhost:52211         0.0.0.0:*               LISTEN      23/python
tcp        0      0 0.0.0.0:8888            0.0.0.0:*               LISTEN      7/python
tcp        0      0 localhost:39388         0.0.0.0:*               LISTEN      23/python
tcp        0      0 localhost:39971         0.0.0.0:*               LISTEN      23/python
tcp        0      0 localhost:32867         0.0.0.0:*               LISTEN      23/python
tcp6       0      0 jupyter-hagen:43878     [::]:*                  LISTEN      45/java
tcp6       0      0 [::]:4040               [::]:*                  LISTEN      45/java
tcp6       0      0 localhost:32816         [::]:*                  LISTEN      45/java
tcp6       0      0 jupyter-hagen:41793     [::]:*                  LISTEN      45/java

One can see that the java processes are listed in a different format because they listen on tcp6 sockets

  9. To check if this was the cause, I set the environment variable '_JAVA_OPTIONS' to "-Djava.net.preferIPv4Stack=true".

  10. This results in the following output but does not resolve the problem:

Proto Recv-Q Send-Q Local Address           Foreign Address         State       PID/Program name
tcp        0      0 localhost:54695         0.0.0.0:*               LISTEN      456/python
tcp        0      0 0.0.0.0:4040            0.0.0.0:*               LISTEN      475/java
tcp        0      0 localhost:33896         0.0.0.0:*               LISTEN      456/python
tcp        0      0 localhost:34990         0.0.0.0:*               LISTEN      475/java
tcp        0      0 localhost:36079         0.0.0.0:*               LISTEN      456/python
tcp        0      0 jupyter-hagen:35119     0.0.0.0:*               LISTEN      475/java
tcp        0      0 localhost:34577         0.0.0.0:*               LISTEN      456/python
tcp        0      0 jupyter-hagen:42195     0.0.0.0:*               LISTEN      475/java
tcp        0      0 localhost:34836         0.0.0.0:*               LISTEN      456/python
tcp        0      0 0.0.0.0:8888            0.0.0.0:*               LISTEN      7/python
tcp        0      0 localhost:39971         0.0.0.0:*               LISTEN      456/python
tcp        0      0 localhost:32867         0.0.0.0:*               LISTEN      456/python
  11. I checked whether the UI is generally accessible by running a local version of the user image on my PC and forwarding the port. That works fine!

My user image is available on docker hub at idalab/spark-user:1.0.2 so this should be easy to inject for debugging if necessary.

Thank you for your help.

Binary websockets currently not working (easy fix)

Hi,

first of all, thanks for the great project, I was halfway through implementing something similar myself when I found it.

I'm using it to proxy a VNC connection (like nbnovnc) which uses a binary websocket. I found that the connection could be established but traffic would stop after the RFB handshake (which just states the protocol version in all-ASCII characters).

Turns out, binary messages from the backend are not relayed properly to the frontend. The reason is https://github.com/jupyterhub/nbserverproxy/blob/master/nbserverproxy/handlers.py#L163

Similar to when proxying from frontend to backend, it should instead be:

self.ws.write_message(message, binary=type(message) is bytes)

With that, e.g., nbnovnc runs perfectly again.

Cheers!

Proxying Xpra is not working

UPD: a way to reproduce: #35 (comment)
UPD: proposed fix: #35 (comment)

I've tried to proxy Xpra html5 client with nbserverproxy (version 0.8.3), but it doesn't seem to work.
When I start Xpra on a local machine with the

xpra start --bind-tcp=0.0.0.0:14500 --html=on --start=xterm

command and open localhost:14500, I see xterm in my browser.

I've tried to run this command from the jupyter terminal and then access .../proxy/14500.
It fails to upgrade the connection to a WebSocket.

When I connect directly, the response to Upgrade request is

HTTP/1.1 101 Switching Protocols
Server: Xpra-WebSockify Python/2.7.5
Date: Wed, 30 May 2018 09:46:33 GMT
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Accept: ***
Sec-WebSocket-Protocol: binary
Expires: 0
Pragma: no-cache
Cache-Control: no-cache, no-store, must-revalidate

But behind nbserverproxy the response is

HTTP/1.1 302 Found
Server: TornadoServer/5.0.2
Content-Type: text/html; charset=UTF-8
Date: Wed, 30 May 2018 09:45:42 GMT
Location: //proxy/14500
Content-Length: 0

There is Wiki page about proxying Xpra with Nginx: https://www.xpra.org/trac/wiki/Nginx
AFAIU it makes Nginx rewrite some HTTP headers so that websockets work behind a proxy.

I know about https://github.com/ryanlovett/nbnovnc/, but I would like to see Xpra working.

Make it a standalone service

It would be useful to have this as a standalone service, to enable deploying any web application behind
JupyterHub auth, not just as a notebook extension.

The package could still provide a serverextension for the existing case.

0.7.1 git tag and PyPI sdist

nbserverproxy 0.7.1 is available as a whl on PyPI. Would it be possible to tag the release in git and provide a sdist on PyPI? Both of these would be helpful for downstream distributors of the software.

Configure proxy request options with traitlets

One can subclass to specify the proxy request options, but it'd be convenient not to have to do that. nbstencila alters request timeouts and I've found I'll need to do the same for syncthing.

WebSocketHandlerMixin assertion error

I started a web service via python -m http.server 8001 and visited proxy/8001. The user server errored with 503 and the log file contained:

[E 2018-01-24 00:10:12.962 SingleUserNotebookApp http1connection:54] Uncaught exception
    Traceback (most recent call last):
      File "/srv/conda/envs/data100/lib/python3.6/site-packages/tornado/http1connection.py", line 238, in _read_message
        delegate.finish()
      File "/srv/conda/envs/data100/lib/python3.6/site-packages/tornado/httpserver.py", line 314, in finish
        self.delegate.finish()
      File "/srv/conda/envs/data100/lib/python3.6/site-packages/tornado/routing.py", line 251, in finish
        self.delegate.finish()
      File "/srv/conda/envs/data100/lib/python3.6/site-packages/tornado/web.py", line 2097, in finish
        self.execute()
      File "/srv/conda/envs/data100/lib/python3.6/site-packages/tornado/web.py", line 2117, in execute
        **self.handler_kwargs)
      File "/srv/conda/envs/data100/lib/python3.6/site-packages/nbserverproxy/handlers.py", line 21, in __init__
        assert WebSocketHandlerMixin in bases
    AssertionError

Run tests during CI

Bumping the version triggers a new pypi upload (#49), but we should run tests prior to doing so. See the script: stanza in .travis.yml.

Usage with `dask-labextension`

This is a follow-up with from some gitter chatting with @yuvipanda.

The dask-labextension JupyterLab extension currently involves adding iframe'd bokeh dashboard panels to the main work area. In order to get around CORS issues, it includes a modified version of nbserverproxy to proxy them under the notebook server origin. It is possible to do the same thing with an unmodified version of the package, but there were a couple of things that made me include the modified version:

  1. I would like to leave open the possibility that the dashboard URL is not on localhost. Our current examples are all on localhost, but I don't view that as a requirement that needs to be enforced, and we want to be able to support a wide variety of dask clusters which may have different deployment patterns that we haven't necessarily thought of.
  2. The dask labextension includes rest endpoints for cluster management as well as dashboard proxying. Currently, the cluster management endpoint is at /dask/clusters/<cluster-id>, and the corresponding dashboard is at /dask/dashboards/<cluster-id>. I'd like to be able to continue this pattern.

So there are two things that would be nice for nbserverproxy (or jupyter-server-proxy) to allow to be configured:

  1. The ability to proxy to arbitrary URLs, rather than just ports on localhost.
  2. The ability to place a given proxy at a different endpoint on the notebook server than /proxy/<port>/<path>.

I know that @yuvipanda has put some significant work into revamping this package recently, so it is possible that one can already do this. If that's the case, do let me know!

Websocket can't initialize with Tornado 4.5.1

Running this with nbrsessionproxy and got the following exception when trying to establish a websocket connection:

[I 2018-05-07 21:32:51.808 sen_fang handlers:136] Trying to establish websocket connection to ws://127.0.0.1:60981/p/6758/websocket/
[E 2018-05-07 21:32:51.808 sen_fang ioloop:638] Exception in callback functools.partial(<function wrap.<locals>.null_wrapper at 0x7f6d9c9598c8>, <tornado.concurrent.Future object at 0x7f6d9cc159e8>)
    Traceback (most recent call last):
      File "/opt/conda/envs/py36/lib/python3.6/site-packages/tornado/ioloop.py", line 605, in _run_callback
        ret = callback()
      File "/opt/conda/envs/py36/lib/python3.6/site-packages/tornado/stack_context.py", line 277, in null_wrapper
        return fn(*args, **kwargs)
      File "/opt/conda/envs/py36/lib/python3.6/site-packages/tornado/ioloop.py", line 626, in _discard_future_result
        future.result()
      File "/opt/conda/envs/py36/lib/python3.6/site-packages/tornado/concurrent.py", line 238, in result
        raise_exc_info(self._exc_info)
      File "<string>", line 4, in raise_exc_info
      File "/opt/conda/envs/py36/lib/python3.6/site-packages/tornado/gen.py", line 307, in wrapper
        yielded = next(result)
      File "<string>", line 6, in _wrap_awaitable
      File "/opt/conda/envs/py36/lib/python3.6/site-packages/nbserverproxy/handlers.py", line 140, in start_websocket_connection
        on_message_callback=message_cb, on_ping_callback=ping_cb)
      File "/opt/conda/envs/py36/lib/python3.6/site-packages/nbserverproxy/handlers.py", line 55, in pingable_ws_connect
        on_ping_callback=on_ping_callback)
      File "/opt/conda/envs/py36/lib/python3.6/site-packages/nbserverproxy/handlers.py", line 35, in __init__
        super().__init__(**kwargs)
      File "/opt/conda/envs/py36/lib/python3.6/site-packages/tornado/websocket.py", line 1076, in __init__
        104857600, self.tcp_client, 65536, 104857600)
      File "/opt/conda/envs/py36/lib/python3.6/site-packages/tornado/simple_httpclient.py", line 196, in __init__
        self.start_time = io_loop.time()
    AttributeError: 'NoneType' object has no attribute 'time'

It looks like io_loop is set to None, but Tornado expects to use it here:
beefebc#diff-ca755767ba44dc77b227f5de4d62ed08R34
https://github.com/tornadoweb/tornado/blob/branch4.5/tornado/websocket.py#L1082
https://github.com/tornadoweb/tornado/blob/branch4.5/tornado/simple_httpclient.py#L196

Allow user to specify port

From exploring running various services using jupyter-server-proxy:

  • some services run on generally recognised ports, albeit allowing users to pass a port number as an argument to the command that starts the service (eg OpenRefine on port 3333); downstream applications may assume this port number;
  • some services appear to run on a default port that cannot be reassigned as part of the start-up command (eg pgAdmin4?);
  • the parameter used to set the port may be arbitrarily named and as such hard to identify unambiguously in a command string.

It would be useful to be able to specify a default port number, e.g as:

c.ServerProxy.servers = {
    'aService': {
        'command': ['serviceCommand', '--p', '{port}'],
        'timeout': 120,
        'port': MYPORT,  # an optional argument specifying the required port number
        'launcher_entry': {
            'enabled': True,
            'title': 'myService',
        },
    },
}
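
A minimal sketch of how such an optional 'port' entry might be consumed when expanding the command template (the function and argument names here are assumptions for illustration, not the actual jupyter-server-proxy implementation):

```python
def expand_command(server, pick_random_port):
    """Use a configured 'port' if present, otherwise allocate a random one."""
    port = server.get('port') or pick_random_port()
    # Substitute the chosen port into any '{port}' placeholders in the command.
    argv = [arg.format(port=port) for arg in server['command']]
    return argv, port
```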

See also: #81

Proxy breaks relative URLs in proxied pages

Test setup / VM:

  • Jupyter notebook on guest port 8888, host port 35180
  • OpenRefine on guest 3334
  • nbserverproxy enabled
  • OpenRefine proxied as http://localhost:35180/proxy/3334

The OpenRefine page doesn't render because the OpenRefine HTML includes relative links to assets:

...
<link type="text/css" rel="stylesheet" href="externals/select2/select2.css" />
 <link type="text/css" rel="stylesheet" href="externals/tablesorter/theme.blue.css" />
...

which resolve as e.g. http://localhost:35180/proxy/externals/select2/select2.css (404) rather than http://localhost:35180/proxy/3334/externals/select2/select2.css (which would resolve correctly).
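
This is standard URL reference resolution: without a trailing slash, the last path segment (3334) is treated as a resource name and replaced. The behaviour can be reproduced with the standard library:

```python
from urllib.parse import urljoin

# Without a trailing slash, '3334' is dropped during resolution.
print(urljoin("http://localhost:35180/proxy/3334",
              "externals/select2/select2.css"))
# With a trailing slash, '3334' is kept as a directory segment.
print(urljoin("http://localhost:35180/proxy/3334/",
              "externals/select2/select2.css"))
```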

Issue with link '/' stripping?

In binder-examples/r#13 we found some strange behavior. It seems like you get drastically different link resolution depending on whether or not there is a / at the end of the URL when using nbserverproxy. For example, the following link:

http://mybinder.org/v2/gh/binder-examples/r/master?urlpath=shiny/bus-dashboard

resolves incorrectly to

https://hub.mybinder.org/hub/bus-dashboard/?token=

while the same link with a slash at the end:

http://mybinder.org/v2/gh/binder-examples/r/master?urlpath=shiny/bus-dashboard/

resolves correctly to

https://hub.mybinder.org/user/binder-examples-r-2i4xen56/shiny/bus-dashboard/?token=<MYTOKEN>

Is this expected behavior?

Client sent subprotocols: []

I got this exception recently when using nbserverproxy to view the Dask distributed dashboard from a running Jupyter notebook.

Client sent subprotocols: []
ERROR:asyncio:Future exception was never retrieved
future: <Future finished exception=IndexError('list index out of range',)>
Traceback (most recent call last):
  File "/opt/conda3/lib/python3.6/site-packages/tornado/gen.py", line 326, in wrapper
    yielded = next(result)
  File "/opt/conda3/lib/python3.6/site-packages/tornado/websocket.py", line 721, in _accept_connection
    self.selected_subprotocol = self.handler.select_subprotocol(subprotocols)
  File "/opt/conda3/lib/python3.6/site-packages/nbserverproxy/handlers.py", line 300, in select_subprotocol
    return subprotocols[0]
IndexError: list index out of range
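
A minimal sketch of a defensive fix (assumed, not the shipped code): return None when the client offered no subprotocols, instead of indexing into an empty list.

```python
def select_subprotocol(subprotocols):
    """Pick the first offered subprotocol, or None if the client sent none."""
    return subprotocols[0] if subprotocols else None
```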

'_xsrf' argument missing from POST

Hey everyone!

We are using jupyter-server-proxy to start an Apollo GraphQL server, but after the GraphQL playground opens (<jupyterlab_url>/metadata) it raises a 403 error.

If I try to use http://localhost:57821/ directly, it works.

Any ideas?

Some logs/info:

[I 11:16:07.265 LabApp] 302 GET /metadata (127.0.0.1) 0.96ms
[D 11:16:07.270 LabApp] Trying to start metadata
[D 11:16:07.277 LabApp] Started metadata
[D 11:16:07.291 LabApp] Connection to http://localhost:57821 refused
[D 11:16:07.292 LabApp] Readyness: False after 0.01347208023071289 seconds, next check in 0.01s
[D 11:16:07.319 LabApp] Connection to http://localhost:57821 refused
[D 11:16:07.319 LabApp] Readyness: False after 0.04051947593688965 seconds, next check in 0.02s
[D 11:16:07.347 LabApp] Connection to http://localhost:57821 refused
[D 11:16:07.347 LabApp] Readyness: False after 0.06887602806091309 seconds, next check in 0.04s
[D 11:16:07.393 LabApp] Connection to http://localhost:57821 refused
[D 11:16:07.393 LabApp] Readyness: False after 0.11508560180664062 seconds, next check in 0.08s
[D 11:16:07.476 LabApp] Connection to http://localhost:57821 refused
[D 11:16:07.476 LabApp] Readyness: False after 0.1979846954345703 seconds, next check in 0.16s
[D 11:16:07.640 LabApp] Connection to http://localhost:57821 refused
[D 11:16:07.641 LabApp] Readyness: False after 0.3623690605163574 seconds, next check in 0.32s
[D 11:16:07.963 LabApp] Connection to http://localhost:57821 refused
[D 11:16:07.964 LabApp] Readyness: False after 0.6852359771728516 seconds, next check in 0.64s
[D 11:16:08.068 LabApp] Accepting token-authenticated connection from 127.0.0.1
[D 11:16:08.070 LabApp] 204 PUT /lab/api/workspaces/lab?1548947768063 (127.0.0.1) 3.38ms
[D 11:16:08.609 LabApp] Connection to http://localhost:57821 refused
[D 11:16:08.609 LabApp] Readyness: False after 1.3308353424072266 seconds, next check in 1.28s
🚀 Server ready at http://localhost:57821/
[D 11:16:09.922 LabApp] Got code 400 back from http://localhost:57821
[D 11:16:09.922 LabApp] Readyness: True after 2.644043207168579 seconds, next check in 2.56s
[D 11:16:09.931 LabApp] 304 GET /metadata/ (127.0.0.1) 2662.15ms
[W 11:16:10.924 LabApp] 403 POST / (127.0.0.1): '_xsrf' argument missing from POST
[W 11:16:10.925 LabApp] 403 POST / (127.0.0.1) 2.22ms referer=http://localhost:8888/metadata/
[W 11:16:15.991 LabApp] 403 POST / (127.0.0.1): '_xsrf' argument missing from POST
[W 11:16:15.992 LabApp] 403 POST / (127.0.0.1) 0.88ms referer=http://localhost:8888/metadata/
[W 11:16:17.013 LabApp] 403 POST / (127.0.0.1): '_xsrf' argument missing from POST
[W 11:16:17.013 LabApp] 403 POST / (127.0.0.1) 1.06ms referer=http://localhost:8888/metadata/

headers from localhost:8888/metadata:

Response Header
==============
HTTP/1.1 403 Forbidden
Server: TornadoServer/5.1.1
Content-Type: text/html
Date: Thu, 31 Jan 2019 15:28:05 GMT
Content-Security-Policy: frame-ancestors 'self'; report-uri /api/security/csp-report
Content-Length: 6171
Set-Cookie: _xsrf=2|6af4b6fe|ae274370a8f898419f71895f21591947|1548948485; Path=/

Request Header
============
POST /metadata HTTP/1.1
Host: localhost:8888
Connection: keep-alive
Content-Length: 1468
accept: */*
Origin: http://localhost:8888
X-Apollo-Tracing: 1
User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36
content-type: application/json
Referer: http://localhost:8888/metadata/
Accept-Encoding: gzip, deflate, br
Accept-Language: es-419,es;q=0.9,en-US;q=0.8,en;q=0.7,pt;q=0.6,gl;q=0.5,la;q=0.4

headers from http://localhost:57821/

Response headers
==============
HTTP/1.1 200 OK
X-Powered-By: Express
Access-Control-Allow-Origin: *
Content-Type: application/json
Content-Length: 25779
Date: Thu, 31 Jan 2019 15:26:49 GMT
Connection: keep-alive

Request headers
=============
POST / HTTP/1.1
Host: localhost:57821
Connection: keep-alive
Content-Length: 1468
accept: */*
Origin: http://localhost:57821
X-Apollo-Tracing: 1
User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36
content-type: application/json
Referer: http://localhost:57821/
Accept-Encoding: gzip, deflate, br
Accept-Language: es-419,es;q=0.9,en-US;q=0.8,en;q=0.7,pt;q=0.6,gl;q=0.5,la;q=0.4

Example extending docker-stacks?

I have a dockerfile for a jupyterlab that is customized to our undergrad computing environment.

I'd like to add a Theia or coding.com version of the Visual Studio IDE (as well as RStudio etc.). I've been reading through the docs and jupyterserverproxy-openrefine, which uses Binder.

That said, it would be very useful to have an example that extends one of the docker-stacks images with a sample extension (e.g. Theia, RStudio). Am I missing this in the documentation?

Support for HTTPS

I am trying to use this server-proxy to talk to an application that accepts only https requests (requires SSL). Is there a way to update this to support https?
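
One way this could look, sketched under the assumption of a per-server flag choosing the backend scheme (the flag name and helper function are hypothetical, not an existing jupyter-server-proxy option):

```python
def backend_url(port, path, use_https=False):
    """Build the URL of the proxied backend, choosing http or https."""
    scheme = "https" if use_https else "http"
    return f"{scheme}://localhost:{port}/{path.lstrip('/')}"
```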

postgres example?

Do you have any examples of starting up postgres using a traitlet? I tried the below but no luck so far. Postgres is set up in the same container as Jupyter (Lab and Hub).

'postgres': {
    'command': ['/usr/lib/postgresql/10/bin/postgres -c config_file=/etc/postgresql/10/main/postgresql.conf -p {port}'],
    'launcher_entry': {
        'title': 'PostgreSQL'
    },
}
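
One guess (untested): 'command' takes a list of argv elements, so the whole command line in a single string will not be split into arguments. A hedged sketch with the arguments separated:

```python
# Untested sketch: each argument as its own list element,
# rather than one long string inside the list.
c.ServerProxy.servers = {
    'postgres': {
        'command': [
            '/usr/lib/postgresql/10/bin/postgres',
            '-c', 'config_file=/etc/postgresql/10/main/postgresql.conf',
            '-p', '{port}',
        ],
        'launcher_entry': {
            'title': 'PostgreSQL',
        },
    },
}
```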

Cutting a new release

The subprotocols fix hasn't made it into a release or a PyPI package. Please publish a new release and upload it to PyPI so I can just pip install the package instead of installing from the git repo.

500 server error - timing out on service start

Trying to run the OpenRefine package demo as hinted at in the docs, I get a 500 internal server error: could not start openrefine in time.

Is there an obvious way of setting the timeout period?

Also, I can start the OpenRefine server fine from a terminal using a similar command to the one defined in the setup, but if I then try to visit the proxied URL I get the same error; i.e. even though the application is running on the proxied port, it doesn't appear: jupyter-server-proxy is presumably trying to start it itself?
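
For reference, configurations elsewhere in these issues show a per-server 'timeout' key (in seconds) that might control this wait. A hedged sketch (the command shown is a placeholder, not the exact OpenRefine invocation):

```python
# Sketch only: 'refine' here is a placeholder for the real launcher command.
c.ServerProxy.servers = {
    'openrefine': {
        'command': ['refine', '-p', '{port}'],
        'timeout': 120,  # seconds to wait for the server to respond
    },
}
```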

Support for websocket compression

This is currently failing for me when proxying from a websocket client that requests permessage-deflate compression:

    Traceback (most recent call last):
      File "/opt/conda/lib/python3.6/site-packages/tornado/http1connection.py", line 185, in _read_message
        header_future = delegate.headers_received(start_line, headers)
      File "/opt/conda/lib/python3.6/site-packages/tornado/http1connection.py", line 651, in headers_received
        return self._delegate.headers_received(start_line, headers)
      File "/opt/conda/lib/python3.6/site-packages/tornado/websocket.py", line 1185, in headers_received
        self.protocol._process_server_headers(self.key, self.headers)
      File "/opt/conda/lib/python3.6/site-packages/tornado/websocket.py", line 783, in _process_server_headers
        raise ValueError("unsupported extension %r", ext)
    ValueError: ('unsupported extension %r', ('permessage-deflate', {'client_max_window_bits': '15'}))

I was able to get it working in my case by changing the source code to instantiate the PingableWSClientConnection with compression_options={}. Not sure if this is the correct fix, or if there's a better way to add support for compression.

Add a banner/wrapper/iframe to proxied content

How feasible is it to add a header/banner/iframe around the proxied content?

The use case would be to have a little bit of screen space above a shiny app or RStudio that is deployed via mybinder.org in which we can show the mybinder.org logo or some such.

Doesn't work with applications which use absolute URLs

I've noticed that a few applications that I've tried to proxy recently use absolute urls.

For example, when running on a regular server, if you go to http://localhost:1234/ it will look for a stylesheet at /style/index.css. This works fine when running locally, but when using nbserverproxy I visit http://localhost:8888/lab/proxy/1234, which still looks for styles at /style/index.css rather than /lab/proxy/1234/style/index.css.

These applications often take a baseurl config option somewhere which lets you prepend all URLs with a base path, such as /lab/proxy/1234. This would let me fix the behaviour. However, it appears nbserverproxy rewrites its URLs, so when I visit http://localhost:8888/lab/proxy/1234 the path gets rewritten to /. This then causes problems because the application is now expecting a base URL of /lab/proxy/1234/. The result is that the HTML page doesn't display because Flask (or whatever) has no route for /, though the CSS would be served fine if it were ever requested.
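
The absolute-path half of this is again plain URL resolution: a leading / discards the proxy prefix entirely, regardless of trailing slashes, as the standard library shows:

```python
from urllib.parse import urljoin

# An absolute path ignores the /lab/proxy/1234/ prefix completely.
print(urljoin("http://localhost:8888/lab/proxy/1234/",
              "/style/index.css"))
```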

Could not start metadata in time

Hi everyone!

I am trying to use the server-proxy to start a GraphQL server to be used by a JupyterLab extension.

But it is raising an error, maybe something related to a timeout. I checked issue #81, but I'm not sure yet how to fix this problem. Any ideas? Thanks!

[I 15:39:51.677 LabApp] 302 GET /metadata (127.0.0.1) 3.38ms
[D 15:39:51.725 LabApp] Trying to start metadata
[D 15:39:51.753 LabApp] Started metadata
[D 15:39:51.792 LabApp] Connection to http://localhost:43571 refused
[D 15:39:51.792 LabApp] Readyness: False after 0.03717780113220215 seconds, next check in 0.01s
[D 15:39:51.804 LabApp] Connection to http://localhost:43571 refused
[D 15:39:51.805 LabApp] Readyness: False after 0.05013012886047363 seconds, next check in 0.02s
[D 15:39:51.828 LabApp] Connection to http://localhost:43571 refused
[D 15:39:51.829 LabApp] Readyness: False after 0.07472777366638184 seconds, next check in 0.04s
[D 15:39:51.873 LabApp] Connection to http://localhost:43571 refused
[D 15:39:51.873 LabApp] Readyness: False after 0.11868023872375488 seconds, next check in 0.08s
[D 15:39:51.956 LabApp] Connection to http://localhost:43571 refused
[D 15:39:51.956 LabApp] Readyness: False after 0.20161056518554688 seconds, next check in 0.16s
[D 15:39:52.119 LabApp] Connection to http://localhost:43571 refused
[D 15:39:52.119 LabApp] Readyness: False after 0.36438775062561035 seconds, next check in 0.32s
[D 15:39:52.443 LabApp] Connection to http://localhost:43571 refused
[D 15:39:52.444 LabApp] Readyness: False after 0.6892299652099609 seconds, next check in 0.64s
[D 15:39:53.088 LabApp] Connection to http://localhost:43571 refused
[D 15:39:53.088 LabApp] Readyness: False after 1.3332841396331787 seconds, next check in 1.28s
[D 15:39:53.485 LabApp] Accepting token-authenticated connection from 127.0.0.1
[D 15:39:53.488 LabApp] 200 GET /api/sessions?1548272393481 (127.0.0.1) 3.66ms
[D 15:39:53.490 LabApp] Accepting token-authenticated connection from 127.0.0.1
[D 15:39:53.491 LabApp] 200 GET /api/terminals?1548272393483 (127.0.0.1) 1.39ms
[D 15:39:53.975 LabApp] Accepting token-authenticated connection from 127.0.0.1
[D 15:39:53.996 LabApp] 200 GET /api/contents/?content=1&1548272393970 (127.0.0.1) 22.11ms
[D 15:39:54.379 LabApp] Connection to http://localhost:43571 refused
[D 15:39:54.381 LabApp] Readyness: False after 2.626580238342285 seconds, next check in 2.3751447105407717s
Type "OrgPerson" is missing a "__resolveType" resolver. Pass false into "resolverValidationOptions.requireResolversForResolveType" to disable this warning.
🚀 Server ready at http://localhost:43571/
[W 15:39:56.799 LabApp] 500 GET /metadata/ (127.0.0.1): could not start metadata in time
[D 15:39:56.802 LabApp] Using contents: services/contents
[D 15:39:56.804 LabApp] Using contents: services/contents
[D 15:39:56.878 LabApp] Path base/images/favicon.ico served from /mnt/sda1/storage/miniconda/envs/jlab-metadata-service-frontend/lib/python3.7/site-packages/notebook/static/base/images/favicon.ico
[D 15:39:56.880 LabApp] Path components/jquery-ui/themes/smoothness/jquery-ui.min.css served from /mnt/sda1/storage/miniconda/envs/jlab-metadata-service-frontend/lib/python3.7/site-packages/notebook/static/components/jquery-ui/themes/smoothness/jquery-ui.min.css
[D 15:39:56.890 LabApp] Path components/jquery-typeahead/dist/jquery.typeahead.min.css served from /mnt/sda1/storage/miniconda/envs/jlab-metadata-service-frontend/lib/python3.7/site-packages/notebook/static/components/jquery-typeahead/dist/jquery.typeahead.min.css
[D 15:39:56.891 LabApp] Path style/style.min.css served from /mnt/sda1/storage/miniconda/envs/jlab-metadata-service-frontend/lib/python3.7/site-packages/notebook/static/style/style.min.css
[D 15:39:56.896 LabApp] Path components/es6-promise/promise.min.js served from /mnt/sda1/storage/miniconda/envs/jlab-metadata-service-frontend/lib/python3.7/site-packages/notebook/static/components/es6-promise/promise.min.js
[D 15:39:56.908 LabApp] Path components/preact/index.js served from /mnt/sda1/storage/miniconda/envs/jlab-metadata-service-frontend/lib/python3.7/site-packages/notebook/static/components/preact/index.js
[D 15:39:56.913 LabApp] Path components/proptypes/index.js served from /mnt/sda1/storage/miniconda/envs/jlab-metadata-service-frontend/lib/python3.7/site-packages/notebook/static/components/proptypes/index.js
[D 15:39:56.914 LabApp] Path components/preact-compat/index.js served from /mnt/sda1/storage/miniconda/envs/jlab-metadata-service-frontend/lib/python3.7/site-packages/notebook/static/components/preact-compat/index.js
[D 15:39:56.921 LabApp] Path components/requirejs/require.js served from /mnt/sda1/storage/miniconda/envs/jlab-metadata-service-frontend/lib/python3.7/site-packages/notebook/static/components/requirejs/require.js
[D 15:39:56.930 LabApp] Path base/images/logo.png served from /mnt/sda1/storage/miniconda/envs/jlab-metadata-service-frontend/lib/python3.7/site-packages/notebook/static/base/images/logo.png
[E 15:39:56.931 LabApp] {
      "Host": "localhost:8888",
      "Connection": "keep-alive",
      "Upgrade-Insecure-Requests": "1",
      "User-Agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36",
      "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8",
      "Accept-Encoding": "gzip, deflate, br",
      "Accept-Language": "es-419,es;q=0.9,en-US;q=0.8,en;q=0.7,pt;q=0.6,gl;q=0.5,la;q=0.4",
      "Cookie": "_ga=GA1.1.726587293.1522349382; EULA=true; Pycharm-3d967870=6b642f5c-9f96-4f91-9e5f-dea863476b55; _xsrf=2|18aeb458|c22d667bca4848416f241db8e3966ee4|1545946955; csrftoken=DGOVTLqw4MoHlAzB3XHPzNjWK8qNGqfrcaW8JHc7sdI6kYcTbroAXgeV1PxqP27s; sessionid=jkc7716ca5cb07hl1j8ldc2f1t1zsztp; __wzdeb7fdcc3aa3f447d6ef8=1546959333|479ba50cdfc6; username-localhost-8889=\"2|1:0|10:1548103374|23:username-localhost-8889|44:YjAxNWNhMGE2N2NkNDQ3MGEzMGE4YjlhMDk2YTc4MDY=|d9669ea076876cea473fb85df4a5fb542c1e42616c8be93e4e39e87aefd45ca3\"; username-localhost-8888=\"2|1:0|10:1548272384|23:username-localhost-8888|44:ZGY5ZjcxMDk2Yjc3NDA4YmJmYzhkN2I1NDgzOTUwNmY=|4533f199088e7c170a3cff4001a2742418f7d7fbbfbc8b9d88d618f300ef385b\""
    }
[E 15:39:56.931 LabApp] 500 GET /metadata/ (127.0.0.1) 5215.92ms referer=None
[D 15:39:56.969 LabApp] Path custom.css served from /mnt/sda1/storage/miniconda/envs/jlab-metadata-service-frontend/lib/python3.7/site-packages/notebook/static/custom/custom.css
[D 15:39:56.971 LabApp] 304 GET /custom/custom.css (127.0.0.1) 3.55ms
[D 15:39:57.192 LabApp] Path components/jquery/jquery.min.js served from /mnt/sda1/storage/miniconda/envs/jlab-metadata-service-frontend/lib/python3.7/site-packages/notebook/static/components/jquery/jquery.min.js
[D 15:39:57.195 LabApp] 200 GET /static/components/jquery/jquery.min.js?v=20190123153454 (127.0.0.1) 4.27ms


websocket connection failed

Hi,
I am using nbserverproxy to proxy one of my services on a port. When I access it, the browser console gives me this error: WebSocket connection to 'ws://jupyterhub.domain.com/api/kernels/ffe8af9a-9052-48fb-ada5-dccd51b08eef/channels?session_id=37504c0eb91537c8d8877a701f8141ee' failed: Error during WebSocket handshake: Unexpected response code: 502

PS: One more doubt: when I call the URL https://jupyterhub.elucidata.io/user/{someuser}/proxy/8866/ it gives a websocket connection error for https://jupyterhub.elucidata.io/api/kernels/{some kernel id}/channels?session_id={session id}. I want to know why it's calling the kernel API directly on the JupyterHub domain.

Thanks in advance
Vidit

More robust handling of broken entrypoints

Can we/should we make jupyter-server-proxy more robust against broken entrypoints? I just had a case where I forgot to import a module in a package that has an entrypoint for server proxy, and as a result none of jupyter-server-proxy worked.

[W 13:54:48.009 NotebookApp] Error loading server extension jupyter_server_proxy
    Traceback (most recent call last):
      File "/srv/conda/lib/python3.7/site-packages/notebook/notebookapp.py", line 1575, in init_server_extensions
        func(self)
      File "/srv/conda/lib/python3.7/site-packages/jupyter_server_proxy/__init__.py", line 27, in load_jupyter_server_extension
        server_proccesses += get_entrypoint_server_processes()
      File "/srv/conda/lib/python3.7/site-packages/jupyter_server_proxy/config.py", line 68, in get_entrypoint_server_processes
        make_server_process(entry_point.name, entry_point.load()())
      File "/home/thead/jupyter_vscode_proxy/__init__.py", line 21, in setup_vscode
        'icon_path': os.path.join(os.path.dirname(os.path.abspath(__file__)), 'icons', 'vscode.svg')
    NameError: name 'os' is not defined
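
A sketch of the suggested hardening (assumed names, not the actual config.py code): catch the load error per entrypoint and skip it, rather than letting one broken package abort the whole extension.

```python
def load_entrypoint_processes(entry_points, make_server_process, log):
    """Collect server processes, skipping entrypoints that fail to load."""
    processes = []
    for ep in entry_points:
        try:
            processes.append(make_server_process(ep.name, ep.load()()))
        except Exception as e:
            # A broken entrypoint is logged and skipped, not fatal.
            log(f"Skipping broken server-proxy entrypoint {ep.name!r}: {e}")
    return processes
```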

502 During Web Socket Handshake

I'm trying to get websockets to work when running a Sanic server via the proxy on https://mybinder.org.

I don't think I know enough to understand whether the 502 Bad Gateway results from a problem with Sanic or with nbserverproxy (or the way I've configured either). Is this a common issue? If not, what information do I need in order to debug this?

@ryanlovett I noticed that you worked on the socket proxy logic - any thoughts?

404 : Page Not Found issue

I have deployed the extension to my JupyterHub and tried accessing the services, but I am getting a 404: page not found.
I checked the console, and it gave me this error message:
require.js?v=6da8be361b9ee26c5e721e76c6d4afce:900 TypeError: Cannot read property 'scrollHeight' of undefined at 8001:192 at Object.execCb (require.js?v=6da8be361b9ee26c5e721e76c6d4afce:1690) at Module.check (require.js?v=6da8be361b9ee26c5e721e76c6d4afce:865) at Module.<anonymous> (require.js?v=6da8be361b9ee26c5e721e76c6d4afce:1140) at require.js?v=6da8be361b9ee26c5e721e76c6d4afce:131 at require.js?v=6da8be361b9ee26c5e721e76c6d4afce:1190 at each (require.js?v=6da8be361b9ee26c5e721e76c6d4afce:56) at Module.emit (require.js?v=6da8be361b9ee26c5e721e76c6d4afce:1189) at Module.check (require.js?v=6da8be361b9ee26c5e721e76c6d4afce:940) at Module.enable (require.js?v=6da8be361b9ee26c5e721e76c6d4afce:1177)

PS : Thanks in advance
Vidit

Absolute URLs in HTML get 404

I have a site running in a JupyterHub environment on a user's notebook server that is proxied using jupyter-server-proxy on port 8080. I am facing issues with the absolute paths of CSS and JS file links in the HTML document. To help repro this, I created a toy website with static content, for example:
http://www.tld.com/user/johndoe/proxy/8080/index.html
The index.html code looks as follows:

<html>
<head>
    <link rel="stylesheet" href="/css/style.css">
</head>
<body>
    <h1>Hello Heading</h1>
    <p>Hello Paragraph</p>
    <button>Hello Button</button>
</body>
</html>

The problem comes with the resolution of the style.css file path, as the browser requests it at http://www.tld.com/hub/css/style.css, which is Not Found (404).
Unfortunately, I cannot adjust the 3rd-party site's absolute paths. Is there a way to root the absolute paths at http://www.tld.com/user/johndoe/proxy/8080/ instead of http://www.tld.com/hub/? Any help would be highly appreciated.

Release 1.0

I think this is at a reasonable place to cut a v1.0 release.

Primary tasks left to do are:

  • Shore up the documentation to be top notch, helping both users and developers understand what they can and cannot do.
  • Make any final breaking API changes
  • Document recommendation to use only in containers
  • Move rsession-proxy into this repo
  • Move docs of contrib packages into docs site

JupyterLab-server-proxy GUI Launchers in Notebooks area

The jupyterlab-launcher-extension GUI places icons in an area marked "Notebooks" even though services may not be notebook-related, e.g. RStudio.

Would it make sense to allow these icons to be registered in the Other area or a newly minted Service Launches area? Or to rename Notebooks to something like Notebooks and Applications?

But then, what if you launched a service that was headless, e.g. a database service with no UI?

It also strikes me that some applications you would only want to start once, although you may want to close and reopen a window onto them repeatedly without starting and stopping the underlying application each time. This contrasts with a notebook launcher, where you presumably do want to start a new notebook process on each launch.

Arising from this, a couple more observations:

  • The notebook UI's Running tab lists terminals and notebooks started from the launcher, but not other running services started from the notebook or JupyterLab launcher;
  • Where a service has a JupyterLab GUI icon, and the service is launched once and once only (e.g. RStudio), would it make sense to allow for two icons, one for 'not running' and the other for 'running'? This would act as an indicator in the JupyterLab context as to whether a service launched from JupyterLab is running or not. (This could be useful e.g. in the case of a service such as a database server started from the JupyterLab UI.)

main.js not found 404 error

Hi, I have followed all the steps for installing this server extension, but when I launch it I get a 404 error:
require.min.js:1 GET http://localhost:8888/main.js net::ERR_ABORTED 404 (Not Found) req.load @ require.min.js:1 load @ require.min.js:1 load @ require.min.js:1 fetch @ require.min.js:1 check @ require.min.js:1 enable @ require.min.js:1 enable @ require.min.js:1 (anonymous) @ require.min.js:1 (anonymous) @ require.min.js:1 each @ require.min.js:1 enable @ require.min.js:1 init @ require.min.js:1 (anonymous) @ require.min.js:1 setTimeout (async) req.nextTick @ require.min.js:1 o @ require.min.js:1 configure @ require.min.js:1 requirejs @ require.min.js:1 (anonymous) @ require.min.js:1 (anonymous) @ require.min.js:1 require.min.js:1 Uncaught Error: Script error for "main" http://requirejs.org/docs/errors.html#scripterror at makeError (require.min.js:1) at HTMLScriptElement.onScriptError (require.min.js:1)

Can anyone help me to resolve this issue .
I have a file that is running on port 8866. I am trying to open it in the way that is described in the README.md file, but keep getting this error again and again.

PS: I checked the source code. In the file https://github.com/jupyterhub/nbserverproxy/blob/master/nbserverproxy/__init__.py, the function _jupyter_nbextension_paths() requires nbserverproxy/main. I guess this line of code is giving me the main.js not found error?

Thanks in advance
Vidit
