
jupyter_http_over_ws's Introduction

Jupyter HTTP-over-WebSocket

This Jupyter server extension allows running Jupyter notebooks that use a WebSocket to proxy HTTP traffic. Browsers do not allow cross-domain communication to localhost via HTTP, but do support cross-domain communication to localhost via WebSocket.

Installation and Setup

Run the following commands in a shell:

pip install jupyter_http_over_ws
# Optional: Install the extension to run every time the notebook server starts.
# Adds a /http_over_websocket endpoint to the Tornado notebook server.
jupyter serverextension enable --py jupyter_http_over_ws

Usage

New notebook servers are started normally, though you will need to set a flag to explicitly trust WebSocket connections from the host communicating via HTTP-over-WebSocket.

jupyter notebook \
  --NotebookApp.allow_origin='https://www.example.com' \
  --port=8081

Note: Before requests will be accepted by your Jupyter notebook, make sure to open the browser window specified in the command-line when the notebook server starts up. This will set an auth cookie that is required for allowing requests (http://jupyter-notebook.readthedocs.io/en/stable/security.html).

Troubleshooting

Receiving 403 errors when attempting connection

If the auth cookie isn't present when a connection is attempted, you may see a 403 error. To help prevent these types of issues, consider starting your Jupyter server using the --no-browser flag and open the provided link that appears in the terminal from the same browser that you would like to connect from:

jupyter notebook \
  --NotebookApp.allow_origin='https://www.example.com' \
  --port=8081 \
  --no-browser

If you still see issues, consider retrying the above steps from an incognito window which will prevent issues related to browser extensions.

Uninstallation

The jupyter server extension can be disabled and removed by running the following commands in a shell:

jupyter serverextension disable --py jupyter_http_over_ws
pip uninstall jupyter_http_over_ws

Contributing

If you have a problem, or see something that could be improved, please file an issue. However, we don't have the bandwidth to support review of external contributions, and we don't want user PRs to languish, so we aren't accepting any external contributions right now.

jupyter_http_over_ws's People

Contributors

blois, colaboratory-team, craigcitro


jupyter_http_over_ws's Issues

Irregular access time to datasets in subgroups (h5py) from colab

Hi, I am a bit disoriented. I am using h5py to read datasets from a hierarchical, multi-group H5 file.

First I implemented a NN using a Jupyter notebook on Google Colab; then I decided to move most of the code into custom libraries and only run a command with arguments from the Jupyter notebook in Colab, but the run was much slower!

In both cases, the code reading the datasets from the HDF5 files lives in custom libraries (.py files called from Jupyter).

The only difference is that in one case I loop through each epoch and batch from Jupyter, while in the new (and much slower) implementation I call custom libraries to loop through epochs and batches; that's all.

In the latter case, the code is much slower. I tracked the origin of the delay carefully, and it seems to be in reading and managing datasets with h5py.

For instance, the following code

def _get_signal_window(self, with_labels=False):
    if (self.get_number_of_avail_windows() == 0):
        self._reset_random_wind()

    sample = self._get_sample()
    Cnp = sample[0]
    Duration = sample[1]
    Dnp = sample[2]
    window_number = sample[3]
    # >>>>>>>>>>>>> HERE IS THE DIFFERENCE
    dset = self.File['Cnp_' + str(Cnp+1) + '/Duration_' + str(Duration+1) + '/Dnp_' + str(Dnp+1) + '/data']
    assert dset.shape[1] % self.length == 0
    samples_per_second = int(dset.shape[1] / self.length)
    samples_per_window = int(samples_per_second * self.window)
    begin = window_number * samples_per_window
    end = begin + samples_per_window
    time_window = torch.Tensor(dset[0,begin:end]).to(self.device)
    clean_signal = torch.Tensor(dset[1,begin:end]).to(self.device)
    noisy_signal = torch.Tensor(dset[2,begin:end]).to(self.device)

    if with_labels:
        starts, widths, amplitudes, categories, number_of_pulses, average_width, average_amplitude = self._get_labels(time_window, Cnp, Duration, Dnp)
    # >>>>>>>>>>>>> HERE IS THE DIFFERENCE
        return time_window, clean_signal, noisy_signal, starts, widths, amplitudes, categories, number_of_pulses, average_width, average_amplitude
    else:
        return time_window, clean_signal, noisy_signal

runs at least 6 times faster when all the code is in Jupyter than when I use custom libraries (remember, the code above lives in a custom library in both cases).

The code between the markers is the part that makes the difference, and it is identical in both cases.

I can share repo link and jupyter files too

What is happening here?

Thanks!!!

Connection to google colab issue.

I am trying to run basic code on Colab but I am not able to connect to any runtime. It just tries to allocate one but never does. It's not just in my Chrome; it's the same in all the browsers on my system. I don't know what to do about this issue. I am also not able to load Stadia, and I cannot use Google Docs, as it asks me to reload.

Can anyone help me with this problem?


Connection to 'Colaboratory' issues

Followed the instructions as described here. Jupyter works on localhost without issues. I opened the firewall to allow TCP over port 9090 and issued this:

jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=9090

and tried to reconnect in Colab; it fails, and this is what is seen in the Jupyter terminal:

[C 16:36:43.917 NotebookApp]

    Copy/paste this URL into your browser when you connect for the first time,
    to login with a token:
        http://localhost:9090/?token=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
[I 16:36:44.072 NotebookApp] Accepting one-time-token-authenticated connection from ::1
[W 16:38:04.056 NotebookApp] Forbidden
[W 16:38:12.181 NotebookApp] 403 GET /api/kernelspecs (::1) 8124.95ms referer=None

Any help appreciated!

Tricky installation on Compute Engine VMs leads to unstable Colab notebooks

Hello Colab team!

I'm having a few issues since the Colab update when trying to connect a Compute Engine VM as a backend.

pip install --upgrade jupyter_http_over_ws>=0.0.7 can only be run as sudo. This makes it so that jupyter serverextension enable --py jupyter_http_over_ws and the Jupyter kernel init (using --allow-root) have to be run as sudo too, because if you don't, they use the regular user's config (so version 0.0.6, which doesn't work with Colab...) while 0.0.7 is only installed for root.
Colab is then able to connect to the VM, but this seems to lead to some weird issues where, for example, f-strings are considered syntax errors despite my Python version being 3.5.3, pubsub_v1 is not found after pip install --upgrade google-cloud-pubsub, etc.

The worst part is, I tried with another, simpler, CPU-only VM I had just created and it works perfectly. I don't understand this seemingly random behavior.

Am I missing something obvious, or is this a known issue with Compute Engine VMs?

Thank you!

tl;dr: some Compute Engine VMs + jupyter_http_over_ws + Colab = weird errors

I reran everything without panicking and it worked just fine... I think it was just caused by my skipping steps, plus some installs I did in the notebook that broke some dependencies. Point taken.

enum34 is being installed on Python 3, sometimes breaking other package installs

  1. enum34 isn't needed on Python 3.6 or later, and is incompatible (it doesn't include enum.IntFlag).
  2. The decision whether to install it is made via an if statement in setup.py.
  3. The currently released universal wheels are generated by Python 2, so enum34 is always installed.
  4. Code that relies on enum.IntFlag can then fail to install, or possibly even fail to run.

Solving this:

  1. The correct way to have conditional installs is via environment markers (see the sketch after this list): https://www.python.org/dev/peps/pep-0508/#environment-markers
  2. Alternatively, you could have a wheel for Python 2 and a wheel for Python 3.
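For illustration, a minimal setup.py sketch of the environment-marker approach; the package name, version, and the exact version cutoff are placeholders, not this repo's actual metadata:

from setuptools import setup

setup(
    name='example_package',  # placeholder, not this project's real setup.py
    version='0.0.0',
    install_requires=[
        # PEP 508 environment marker: evaluated by the *installing*
        # interpreter, so one universal wheel stops pulling in enum34
        # on modern Python regardless of which interpreter built it.
        'enum34; python_version < "3.4"',
    ],
)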

Are tensorboard event files saved differently on colab from saving them on local machines?

I recently switched from running my Python TensorFlow code on my local machine to running it on Colab, and used the same code to run the models (all I changed in the entire code was one directory path, i.e. os.getcwd()), so I wanted to check whether Colab saves TensorBoard files a bit differently before I chase my tail.
It seems that tf.summary.FileWriter creates a folder for the event files, but the event files themselves don't appear in my Google Drive.

This is how I save it (rando is a random number so I am definitely able to find that distinct folder) :

train_writer = tf.summary.FileWriter('{}/train_{}'.format(fileprefix,rando), graph=graph)
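One hedged thing to check (an assumption on my part, not a confirmed cause): tf.summary.FileWriter buffers events in memory, so an event file can look empty or missing until the writer is flushed or closed. A sketch using placeholder values for fileprefix, rando, and graph:

import tensorflow as tf  # TF 1.x API, matching the snippet above

fileprefix, rando = './logs', 42  # placeholders standing in for the issue's values
graph = tf.get_default_graph()    # placeholder graph

train_writer = tf.summary.FileWriter('{}/train_{}'.format(fileprefix, rando), graph=graph)
# ... training loop adds summaries here ...
train_writer.flush()  # force buffered events out to disk
train_writer.close()  # close the writer when done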

Colab Suddenly stopped connecting to local runtime in jupyter

A few hours ago I was able to connect my Colab notebook to my local runtime using Jupyter. I am now suddenly unable to, despite reinstalling jupyter_http_over_ws and running with --NotebookApp.allow_origin, --no-browser, and a variety of other flags.

Automatically enable notebook serverextension via setup.py config

Many other Jupyter notebook plugins (e.g. jupyterlab or jupyterlab-git) automatically enable themselves via the mechanism described here: https://jupyter-notebook.readthedocs.io/en/stable/examples/Notebook/Distributing%20Jupyter%20Extensions%20as%20Python%20Packages.html#Automatically-enabling-a-server-extension-and-nbextension

jupyterlab for example, does this here: https://github.com/jupyterlab/jupyterlab/blob/master/setup.py#L46

It'd be great if jupyter_http_over_ws could do the same, which would obviate the need to explicitly run jupyter serverextension enable --py jupyter_http_over_ws after installing.
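For reference, a sketch of the pattern those docs describe, applied hypothetically to this package (the file paths here are assumptions, not the repo's actual layout):

from setuptools import setup

setup(
    name='jupyter_http_over_ws',
    # ... other setup() arguments unchanged ...
    data_files=[
        # Installs a config snippet the notebook server picks up at startup,
        # enabling the extension without a manual `jupyter serverextension
        # enable` step. The JSON file would contain:
        #   {"NotebookApp": {"nbserver_extensions":
        #       {"jupyter_http_over_ws": true}}}
        ('etc/jupyter/jupyter_notebook_config.d',
         ['jupyter-config/jupyter_notebook_config.d/jupyter_http_over_ws.json']),
    ],
)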

Sporadically (but often) getting 404 GET error "/api/contents/var/colab/app.log (127.0.0.1): No such file or directory: var/colab/app.log"

Hi!

I have a problem: my Colab notebook keeps crashing when connected to the local runtime (over two SSH tunnels, if it makes a difference) with the error 404 GET /api/contents/var/colab/app.log (127.0.0.1): No such file or directory: var/colab/app.log. I pasted the cmd output below. Sometimes it happens immediately when I try to run the first cell in the notebook after a successful connection to the local runtime, and sometimes after some time mid-training (using TF v2). My OS is Ubuntu 18.04.

Do you know this problem? Any ideas on how to fix it?

[I 09:13:30.945 NotebookApp] Jupyter Notebook 6.4.6 is running at:
[I 09:13:30.945 NotebookApp] http://localhost:8888/?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa
[I 09:13:30.945 NotebookApp]  or http://127.0.0.1:8888/?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa
[I 09:13:30.945 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
[C 09:13:30.948 NotebookApp]

    To access the notebook, open this file in a browser:
        file:///home/pjanuszewski/.local/share/jupyter/runtime/nbserver-8792-open.html
    Or copy and paste one of these URLs:
        http://localhost:8888/?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa
     or http://127.0.0.1:8888/?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa
[I 09:13:50.656 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.910000ms
[I 09:13:50.865 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.880000ms
[I 09:13:50.905 NotebookApp] Kernel started: f4a78d66-276f-4b34-a18a-88722e9d992e, name: python3
[I 09:13:50.952 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.930000ms
[I 09:13:50.963 NotebookApp] proxying WebSocket connection to: ws://localhost:8888/api/kernels/f4a78d66-276f-4b34-a18a-88722e9d992e/channels?session_id=cad2755a9fd5414ec4dd128190228efc&jupyter_http_over_ws_auth_url=http%3A%2F%2Flocalhost%3A8888%2F%3Ftoken%3De848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa
[I 09:13:54.878 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.900000ms
[I 09:13:56.594 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.840000ms
[I 09:13:57.777 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.810000ms
[I 09:14:11.905 NotebookApp] KernelRestarter: restarting kernel (1/5), keep random ports
WARNING:root:kernel f4a78d66-276f-4b34-a18a-88722e9d992e restarted
[I 09:14:12.374 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 1.430000ms
[I 09:14:12.394 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.490000ms
[W 09:14:12.401 NotebookApp] 404 GET /api/contents/var/colab/app.log (127.0.0.1): No such file or directory: var/colab/app.log
[W 09:14:12.401 NotebookApp] No such file or directory: var/colab/app.log
[W 09:14:12.401 NotebookApp] 404 GET /api/contents/var/colab/app.log (127.0.0.1) 0.840000ms referer=None
[I 09:14:12.406 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.350000ms
[W 09:14:12.412 NotebookApp] 404 GET /api/contents/var/colab/ooms (127.0.0.1): No such file or directory: var/colab/ooms
[W 09:14:12.412 NotebookApp] No such file or directory: var/colab/ooms
[W 09:14:12.412 NotebookApp] 404 GET /api/contents/var/colab/ooms (127.0.0.1) 0.590000ms referer=None
[I 09:14:15.560 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.800000ms
[W 09:14:15.573 NotebookApp] 404 GET /api/contents/var/colab/app.log (127.0.0.1): No such file or directory: var/colab/app.log
[W 09:14:15.574 NotebookApp] No such file or directory: var/colab/app.log
[W 09:14:15.574 NotebookApp] 404 GET /api/contents/var/colab/app.log (127.0.0.1) 1.180000ms referer=None
[I 09:15:18.702 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.800000ms
[I 09:15:23.921 NotebookApp] KernelRestarter: restarting kernel (1/5), keep random ports

Thanks!
Piotr

0.0.3 doesn't work with latest tornado (6.0.0)

Tornado released 6.0.0 on 01/03/2019, making jupyter_http_over_ws raise an ImportError when invoking:

jupyter serverextension enable --py jupyter_http_over_ws

https://github.com/tornadoweb/tornado/issues/2383
 File "<site-packages\jupyter_http_over_ws\handlers.py", line 28, in <module>
    from tornado import stack_context
ImportError: cannot import name 'stack_context'

Using tornado==5.1 makes it work.
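For anyone hitting this before a fixed release lands, the pin the reporter describes is just:

pip install tornado==5.1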

api/session names are URL encoded and display incorrectly in a jupyter server

When connecting Colab to a Jupyter server, the frontend application sends URL-encoded parameters to /api/sessions, which causes the API to return hard-to-read names. While this is coming from the frontend, the encoding could be undone in this package when forwarding the API request to the Jupyter server.

This can be seen by creating a notebook in drive with a character which needs to be escaped and looking at the websocket connections that are happening to the handler this package provides.

An example response from /api/sessions after loading such a notebook might be the following:

[
    {
        "id": "fa293a7e-afb2-4fca-92c2-ffb9e548e052",
        "kernel": {
            "connections": 0,
            "execution_state": "idle",
            "id": "0ece430d-e288-4907-973f-d58a6d8010ff",
            "last_activity": "2021-02-17T06:31:48.663580Z",
            "name": "Python"
        },
        "name": "%5Btest%5D%20example%20notebook.ipynb",
        "notebook": {
            "name": "%5Btest%5D%20example%20notebook.ipynb",
            "path": "fileId=?????????????????????????????????"
        },
        "path": "fileId=?????????????????????????????????",
        "type": "notebook"
    }
]

While the expected response would instead include the string [test] example notebook.ipynb.
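As a sketch of the transformation involved (a hypothetical helper, not code from this package), the forwarding layer could unescape the human-readable fields:

from urllib.parse import unquote

def decode_session_name(session):
    # Undo URL escaping on display names; the path is left alone since
    # the backend may depend on its exact form (e.g. the fileId=... value).
    session['name'] = unquote(session['name'])
    if 'notebook' in session:
        session['notebook']['name'] = unquote(session['notebook']['name'])
    return session

# unquote('%5Btest%5D%20example%20notebook.ipynb')
# -> '[test] example notebook.ipynb'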

Seeing a 404 GET

  • I am connected to a remote GPU server from my local machine through SSH port forwarding.
  • My remote GPU machine has a docker instance with exposed ports, and I am running the jupyter notebook inside the docker instance.

I can access the Jupyter server on my local machine, but when I try to connect to a local runtime in Google Colab, I get this error:

404 GET /http_over_websocket?min_version=0.0.7&jupyter_http_over_ws_auth_url=http%3A%2F%2Flocalhost%3A8098%2F%3Ftoken%{TOKEN_NOT_INCLUDED} (172.17.0.1) 15.77ms referer=None

Can't force usage of specific kernel

I apologize in advance for the vague issue, but I can't seem to change the Jupyter kernel used by my local runtime. I've tried setting the default kernel in the config and running Jupyter from within a conda virtual environment. If anyone is aware of a way to do this, it would be greatly appreciated.
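For anyone debugging the same thing, the standard Jupyter CLI at least shows which kernelspecs the server can offer (whether Colab honors the configured default is exactly the open question here):

jupyter kernelspec list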

Connection refused

My issue may be similar to @debsahu on #1.

I followed instructions as described here.

I am attempting to get Colaboratory working from a browser on my laptop using a different computer as the local runtime. Let's call this computer "local_server"; it happens to have the IP address 10.1.1.3 on my local LAN. The browser I am using is Chrome (Version 67.0.3396.99 (Official Build) (64-bit)).

  • Colaboratory works with no problems with a hosted runtime.
  • Colaboratory works with no problems with my laptop as the local runtime (on my laptop I have Jupyter running on port 8888).
  • I have Jupyter set up on my local_server, also on port 8888.
  • Colaboratory works with no problems with a local runtime when I am using Chrome on the local_server machine.

I now want to work on colaboratory on my laptop using local_server as the local runtime

  • To get SSH forwarding working, I run this command on my laptop:
ssh -L 8889:10.1.1.3:8888 10.1.1.3
  • In the jupyter_notebook_config.py file on the local_server machine I set
c.NotebookApp.ip = '10.1.1.3'

Now I can go to http://localhost:8889/tree on my laptop and Jupyter works, but Colaboratory does not. I get the following error from Jupyter:

[E 10:22:43.257 NotebookApp] Uncaught error when proxying request
    Traceback (most recent call last):
      File "/usr/local/lib/python3.4/dist-packages/jupyter_http_over_ws/handlers.py", line 151, in on_message
        response.rethrow()
      File "/usr/local/lib/python3.4/dist-packages/tornado/httpclient.py", line 652, in rethrow
        raise self.error
    ConnectionRefusedError: [Errno 111] Connection refused

What am I doing wrong? Any suggestions would be appreciated. Thanks.

anaconda

How can I install this package in Anaconda and use it in Anaconda envs?
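The package is on PyPI, so one hedged approach is pip inside the activated conda environment (the environment name here is a placeholder):

conda activate myenv
pip install jupyter_http_over_ws
jupyter serverextension enable --py jupyter_http_over_ws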

Auth problem

I have installed both Jupyter and its corresponding extension jupyter_http_over_ws. But when I try to run Colab with local resources, I get an error (attached as a screenshot).

I have also verified that if I disable authentication with this command:

jupyter notebook --no-browser --allow-root --NotebookApp.allow_origin='https://colab.research.google.com' --NotebookApp.token='' --NotebookApp.disable_check_xsrf=True

Everything works fine, but I don't feel comfortable working like this.
Hope someone can help with this problem.
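A less drastic invocation, in line with the README's troubleshooting advice, keeps token auth enabled and relies on opening the printed link from the same browser that Colab runs in (the port is an assumption):

jupyter notebook \
  --NotebookApp.allow_origin='https://colab.research.google.com' \
  --port=8888 \
  --no-browser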

Webauthentication using python - not working

I am trying to download a CSV from a website. The code works fine in an Anaconda Jupyter notebook. (The code was attached as a screenshot.)

But when I run the same code on Google Colab, I get an error (also attached as a screenshot).

Why is this happening?
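Since the original code survives only as a screenshot, here is just a generic sketch of an authenticated CSV download with requests (the URL and credentials are placeholders, not the reporter's):

import requests

resp = requests.get(
    'https://example.com/data.csv',  # placeholder URL
    auth=('user', 'password'),       # placeholder HTTP basic-auth credentials
)
resp.raise_for_status()  # surface 4xx/5xx instead of silently saving an error page
with open('data.csv', 'wb') as f:
    f.write(resp.content)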

NotebookApp version needing upgrade when connecting to local runtime

Please excuse me, and direct me to another source of information if there is one (I couldn't find anything on this).

I am trying to connect Google Colab's local runtime to my Jupyter notebook, and when I try to, I get this message:

[E 13:51:13.307 NotebookApp] Rejecting connection: Requested version (0.0.1a2) > Current version (0.0.1a1). Please upgrade this package.

I was running local runtime jobs perfectly fine this morning, but a couple of hours later it suddenly stopped working. I don't believe I even touched the machine in that timeframe.

This was the entire message:

(base) C:\Users\ballcap>jupyter notebook \ --NotebookApp.allow_origin='https://colab.research.google.com' \ --port=8889
jupyter_http_over_ws extension initialized. Listening on /http_over_websocket
[I 13:50:46.680 NotebookApp] JupyterLab alpha preview extension loaded from C:\Users\2hinc\AppData\Local\Continuum\anaconda3\lib\site-packages\jupyterlab
JupyterLab v0.27.0
Known labextensions:
[I 13:50:46.727 NotebookApp] Running the core application with no additional extensions or settings
[I 13:50:46.758 NotebookApp] Serving notebooks from local directory: C:\
[I 13:50:46.773 NotebookApp] The Jupyter Notebook is running at:
[I 13:50:46.789 NotebookApp] http://localhost:8889/?token=0af42d8c7656bc5a04fea50ea6623ae8bb099209d086e31b
[I 13:50:46.805 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
[C 13:50:46.836 NotebookApp]

    Copy/paste this URL into your browser when you connect for the first time,
    to login with a token:
        http://localhost:8889/?token=0af42d8c7656bc5a04fea50ea6623ae8bb099209d086e31b
[I 13:50:47.007 NotebookApp] Accepting one-time-token-authenticated connection from ::1
[E 13:51:13.307 NotebookApp] Rejecting connection: Requested version (0.0.1a2) > Current version (0.0.1a1). Please upgrade this package.
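The error message itself points at the fix; upgrading the extension on the local machine should bring it up to the requested version:

pip install --upgrade jupyter_http_over_ws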

Connecting to local runtime via Google Cloud SDK

Hi,

I am attempting to connect to a VM instance I started on GCP so that I can take advantage of a GPU in Google Colab. I am following the instructions here. I also attempted solutions from the issues raised here and here, but they didn't work; I think my problem is somewhat different. I am using the following commands.
First, in Google Cloud SDK:
gcloud compute ssh --zone "us-central1-a" [instance_name] --project [project_name] -- -L 8889:localhost:8889
Then, in the instance that starts up:
jupyter serverextension enable --py jupyter_http_over_ws
jupyter notebook --NotebookApp.allow_origin="https://colab.research.google.com" --port=8889 --NotebookApp.port_retries=0 --no-browser

After those, my instance generates the URL as expected, but when I paste it into my browser it says 'The site can't be reached' (connection reset error).
Any help or suggestions would be greatly appreciated!

Jupyter notebook won't connect to kernel

I installed this extension, and now my Jupyter won't run anything, as it can't connect to the kernel. Please advise on how I can troubleshoot or uninstall this extension. Thanks!
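The README's uninstallation steps cover the removal part:

jupyter serverextension disable --py jupyter_http_over_ws
pip uninstall jupyter_http_over_ws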

Cannot connect when port changes via proxy/ssh

If connecting to a local runtime via port forwarding and the port number is forwarded to a different port, the proxy will fail to connect and will generate the error:

[E 21:14:40.883 NotebookApp] Couldn't attach auth cookies
[E 21:14:59.596 NotebookApp] Uncaught error when proxying request
    Traceback (most recent call last):
      File "/opt/conda/lib/python3.8/site-packages/tornado/tcpclient.py", line 138, in on_connect_done
        stream = future.result()
    tornado.iostream.StreamClosedError: Stream is closed

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "/opt/conda/lib/python3.8/site-packages/jupyter_http_over_ws/handlers.py", line 179, in _attach_auth_cookies
        extra_cookies = yield _perform_request_and_extract_cookies(
      File "/opt/conda/lib/python3.8/site-packages/tornado/gen.py", line 762, in run
        value = future.result()
      File "/opt/conda/lib/python3.8/site-packages/tornado/gen.py", line 769, in run
        yielded = self.gen.throw(*exc_info)  # type: ignore
      File "/opt/conda/lib/python3.8/site-packages/jupyter_http_over_ws/handlers.py", line 533, in _perform_request_and_extract_cookies
        response = yield http_client.fetch(proxy_request, raise_error=False)
      File "/opt/conda/lib/python3.8/site-packages/tornado/gen.py", line 762, in run
        value = future.result()
      File "/opt/conda/lib/python3.8/site-packages/tornado/iostream.py", line 1205, in connect
        self.socket.connect(address)
    OSError: [Errno 99] Cannot assign requested address

The connection works if I ensure that the port is not changed during the forwarding.
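Consistent with that observation, a forwarding command that keeps the same port number on both ends avoids the failure (the user and host names are placeholders):

ssh -L 8888:localhost:8888 user@remote-host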

Blocking Cross Origin API request

Hi, I'm having trouble connecting to local runtime. This is the error I am getting:

(base) C:\Users\harol>jupyter notebook \
  --NotebookApp.allow_origin='https://colab.research.google.com' \
  --port=8888 \
  --NotebookApp.port_retries=0 --no-browser

jupyter_http_over_ws extension initialized. Listening on /http_over_websocket
[I 01:29:45.989 NotebookApp] JupyterLab extension loaded from C:\Users\harol\anaconda3\lib\site-packages\jupyterlab
[I 01:29:45.989 NotebookApp] JupyterLab application directory is C:\Users\harol\anaconda3\share\jupyter\lab
[I 01:29:46.005 NotebookApp] Serving notebooks from local directory: C:\
[I 01:29:46.006 NotebookApp] The Jupyter

I've followed the instructions for setting it up and browsed all of the issues to see whether anyone has had similar problems.

Any suggestions are greatly appreciated. Thanks!

IOPub data rate exceeded.

How do I rectify this in Google Colab?

The notebook server will temporarily stop sending output to the client in order to avoid crashing it.
To change this limit, set the config variable
--NotebookApp.iopub_data_rate_limit.

Current values:
NotebookApp.iopub_data_rate_limit=1000000.0 (bytes/sec)
NotebookApp.rate_limit_window=3.0 (secs)
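For a local runtime, the limit can be raised when starting the server; the value below is just an example tenfold increase over the default shown above:

jupyter notebook --NotebookApp.iopub_data_rate_limit=10000000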

0.0.7 makes it impossible to connect Colab to Jupyter through a proxy

Everything is in the title:
the last update, to 0.0.7, makes it impossible to connect Colab to a Jupyter instance through a proxy.

[Collab] --> [Local proxy ws://] --> -LAN- --> [GPU server running a jupyter instance.]

The following trace is observable on the Jupyter instance:

[E 19:36:25.303 NotebookApp] Uncaught error when proxying request
    Traceback (most recent call last):
      File "/usr/local/lib/python2.7/dist-packages/jupyter_http_over_ws/handlers.py", line 178, in _attach_auth_cookies
        _validate_same_domain(self.request, parsed_auth_url)
      File "/usr/local/lib/python2.7/dist-packages/jupyter_http_over_ws/handlers.py", line 503, in _validate_same_domain
        handler_domain.geturl(), url.geturl()))
    ValueError: Invalid cross-domain request from http://192.168.1.119:8888 to http://localhost:8899/
[E 19:36:25.304 NotebookApp] Couldn't attach auth cookies

Port 8899 is on the localhost machine where Colab is launched (my workstation).
This machine also runs a proxy redirecting the WebSocket request to an internal LAN server serving Jupyter: 192.168.1.119 on port 8888.

Jupyter is launched with the following options:

jupyter notebook --NotebookApp.allow_origin='' --port=8888 --ip='0.0.0.0' --NotebookApp.port_retries=0 --NotebookApp.token='' --NotebookApp.password='' --NotebookApp.disable_check_xsrf=True  --allow-root --no-browser

It was working on 0.0.6. Any advice?

Not able to connect local Jupyter machine to Google Colab

I am trying to connect it to Google Colab using this command:

jupyter notebook \
  --NotebookApp.allow_origin='https://colab.research.google.com' \
  --port=8081 \
  --NotebookApp.port_retries=0

After this, the local machine is running on that port, but I am not able to connect it to Colab. I tried restarting my notebook, generating new tokens, and disabling all the firewalls. Then I tried ngrok for tunneling the token URL, but I got stuck on where to use that tunneling URL. Then I tried changing the jupyter_notebook_config.py file by adding c.NotebookApp.allow_remote_access = True, as Bing suggested, but it still was not connecting.

Can anyone please help me with this, or suggest any other way I can expose my localhost on the Internet for changing and debugging the .ipynb file located on the local server?

Error loading server extension —

I'm trying to connect to a local runtime, but I'm getting the following error: Error loading server extension —.

jupyter_http_over_ws extension initialized. Listening on /http_over_websocket
[I 00:48:59.298 NotebookApp] JupyterLab extension loaded from C:\Users\Issstezac1\anaconda3\lib\site-packages\jupyterlab
[I 00:48:59.298 NotebookApp] JupyterLab application directory is C:\Users\Issstezac1\anaconda3\share\jupyter\lab
[W 00:48:59.314 NotebookApp] Error loading server extension —
    Traceback (most recent call last):
      File "C:\Users\Issstezac1\anaconda3\lib\site-packages\notebook\notebookapp.py", line 1670, in init_server_extensions
        mod = importlib.import_module(modulename)
      File "C:\Users\Issstezac1\anaconda3\lib\importlib\__init__.py", line 127, in import_module
        return _bootstrap._gcd_import(name[level:], package, level)
      File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
      File "<frozen importlib._bootstrap>", line 983, in _find_and_load
      File "<frozen importlib._bootstrap>", line 965, in _find_and_load_unlocked
    ModuleNotFoundError: No module named '—'
[I 00:48:59.314 NotebookApp] Serving notebooks from local directory: C:\
[I 00:48:59.314 NotebookApp] The Jupyter Notebook is running at:
[I 00:48:59.330 NotebookApp] http://localhost:8888/?token=325c5ffe303c1c7b54b2639d3c6a256550411126dbfb9745
[I 00:48:59.330 NotebookApp]  or http://127.0.0.1:8888/?token=325c5ffe303c1c7b54b2639d3c6a256550411126dbfb9745
[I 00:48:59.330 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
[C 00:48:59.392 NotebookApp]

    To access the notebook, open this file in a browser:
        file:///C:/Users/Issstezac1/AppData/Roaming/jupyter/runtime/nbserver-2716-open.html
    Or copy and paste one of these URLs:
        http://localhost:8888/?token=325c5ffe303c1c7b54b2639d3c6a256550411126dbfb9745
     or http://127.0.0.1:8888/?token=325c5ffe303c1c7b54b2639d3c6a256550411126dbfb9745
[W 00:50:09.451 NotebookApp] Blocking Cross Origin API request for /http_over_websocket.  Origin: https://colab.research.google.com, Host: localhost:8888
[W 00:50:09.451 NotebookApp] 403 GET /http_over_websocket?min_version=0.0.7&jupyter_http_over_ws_auth_url=http%3A%2F%2Flocalhost%3A8888%2F%3Ftoken%3D325c5ffe303c1c7b54b2639d3c6a256550411126dbfb9745 (::1) 0.00ms referer=None
[W 00:50:12.951 NotebookApp] Blocking Cross Origin API request for /http_over_websocket/diagnose.  Origin: https://colab.research.google.com, Host: localhost:8888
[W 00:50:12.951 NotebookApp] 403 GET /http_over_websocket/diagnose?min_version=0.0.7 (::1) 0.00ms referer=None

I followed every step of the documentation, but I still cannot connect locally. Currently, I'm using Mozilla Firefox 77.0.1 (64-bit). I therefore enabled network.websocket.allowInsecureFromHTTPS, as mentioned in the browser-specific settings. I even tried the --no-browser flag, as the Jupyter HTTP-over-WebSocket troubleshooting suggests. What am I doing wrong?

Any further help will be appreciated.

Firewall blocking connecting colab to local runtime

I cannot connect Colab to my local runtime using the instructions provided by Google. However, when I turn off my firewall it works perfectly fine. Can you tell me which protocol or port I should allow through my firewall to make the connection? I don't want to turn off my firewall or skip the Jupyter token, due to the security threats that doing so poses.

Error while enabling jupyter_http_over_ws extension

I'm following the instructions described here.

The very second step, enabling the jupyter_http_over_ws Jupyter extension, gives the following error:

Traceback (most recent call last):
  File "/usr/local/bin/jupyter-serverextension", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python2.7/dist-packages/jupyter_core/application.py", line 267, in launch_instance
    return super(JupyterApp, cls).launch_instance(argv=argv, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/traitlets/config/application.py", line 658, in launch_instance
    app.start()
  File "/usr/local/lib/python2.7/dist-packages/notebook/serverextensions.py", line 293, in start
    super(ServerExtensionApp, self).start()
  File "/usr/local/lib/python2.7/dist-packages/jupyter_core/application.py", line 256, in start
    self.subapp.start()
  File "/usr/local/lib/python2.7/dist-packages/notebook/serverextensions.py", line 210, in start
    self.toggle_server_extension_python(arg)
  File "/usr/local/lib/python2.7/dist-packages/notebook/serverextensions.py", line 199, in toggle_server_extension_python
    m, server_exts = _get_server_extension_metadata(package)
  File "/usr/local/lib/python2.7/dist-packages/notebook/serverextensions.py", line 327, in _get_server_extension_metadata
    m = import_item(module)
  File "/usr/local/lib/python2.7/dist-packages/traitlets/utils/importstring.py", line 42, in import_item
    return __import__(parts[0])
ImportError: No module named jupyter_http_over_ws

Tried installing tornado==5.1 as mentioned in #5

Also tried upgrading the package:
pip install --upgrade jupyter_http_over_ws

COLAB-LOCAL: Error in parts[[i]] : subscript out of bounds

I can't connect to a Colab local runtime.

System:
OS: Ubuntu 18.04
Install script

apt -y install python3 python3-pip
ln -s $(which python3) /usr/local/bin/python
pip --no-cache-dir install --upgrade pip
pip3 --no-cache-dir install --upgrade jupyter
pip3 --no-cache-dir install --upgrade jupyter_http_over_ws
jupyter serverextension enable --py jupyter_http_over_ws

# Load Jupyter Server
jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=8081 \
--allow-root --NotebookApp.port_retries=0 --NotebookApp.token='' --NotebookApp.password='' \
--NotebookApp.disable_check_xsrf=True --ip="*" --no-browser

I checked #12, but it still can't connect.

[I 18:18:13.963 NotebookApp] KernelRestarter: restarting kernel (1/5), keep random ports
[E 18:18:24.998 NotebookApp] Uncaught error when proxying request
    Traceback (most recent call last):
      File "/opt/ve/lib/python3.6/site-packages/jupyter_http_over_ws/handlers.py", line 446, in _get_proxied_ws
        client = yield self._proxied_ws_future
      File "/opt/ve/lib/python3.6/site-packages/tornado/gen.py", line 735, in run
        value = future.result()
      File "/opt/ve/lib/python3.6/site-packages/jupyter_http_over_ws/handlers.py", line 446, in _get_proxied_ws
        client = yield self._proxied_ws_future
      File "/opt/ve/lib/python3.6/site-packages/tornado/gen.py", line 735, in run
        value = future.result()
      File "/opt/ve/lib/python3.6/site-packages/jupyter_http_over_ws/handlers.py", line 446, in _get_proxied_ws
        client = yield self._proxied_ws_future
      File "/opt/ve/lib/python3.6/site-packages/tornado/gen.py", line 735, in run
        value = future.result()
      File "/opt/ve/lib/python3.6/site-packages/tornado/gen.py", line 742, in run
        yielded = self.gen.throw(*exc_info)  # type: ignore
      File "/opt/ve/lib/python3.6/site-packages/jupyter_http_over_ws/handlers.py", line 437, in _on_open
        proxy_request, on_message_callback=self._on_proxied_message)
      File "/opt/ve/lib/python3.6/site-packages/tornado/gen.py", line 735, in run
        value = future.result()
    tornado.simple_httpclient.HTTPTimeoutError: Timeout during request
